Simon Willison

@prem_k ollama? Yeah it’s pretty good, they’re very on top of adding new models

Prem Kumar Aparanji 👶🤖🐘

@simon no, wllama is the WASM binding for llama.cpp and can run inference on the GGUF files of models within the browser itself. It is an alternative to Transformers.js (ONNX files) and WebLLM (bin shards).

github.com/ngxson/wllama
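For context, a minimal sketch of what in-browser GGUF inference with wllama looks like, based on the API shape shown in the project's README. The WASM paths, model URL, and option values here are illustrative assumptions, not verified against a specific wllama release:

```javascript
import { Wllama } from '@wllama/wllama';

// Paths to the WASM binaries shipped with the package (assumed layout).
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': './esm/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': './esm/multi-thread/wllama.wasm',
};

const wllama = new Wllama(CONFIG_PATHS);

// Stream a quantized GGUF model into the browser (illustrative URL).
await wllama.loadModelFromUrl(
  'https://example.com/models/tinyllama.Q4_K_M.gguf'
);

// Run a completion entirely client-side; nPredict caps output tokens.
const output = await wllama.createCompletion('WASM is', { nPredict: 50 });
console.log(output);
```

Everything runs in the page itself: once the model file is cached, no server round-trip is needed for inference.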

Simon Willison

@prem_k oh fantastic! I’ve played with github.com/mlc-ai/web-llm but I didn’t know about the llama.cpp port, that’s awesome

Prem Kumar Aparanji 👶🤖🐘

@simon #WASM is such an interesting development for web apps that can run locally in the browser, even when offline. 😃

On a slightly related note, MotherDuck's 1.5-tier architecture powered by WASM is pretty cool too, especially when you're able to join between tables in your browser and in your cloud in a single SQL query.

I wonder what else WASM will bring. 🤞🏼

Simon Willison

@prem_k I love how easy WASM makes it to run tools like Tesseract - I built this OCR tool using Tesseract.js and PDF.js and it works really well tools.simonwillison.net/ocr
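For reference, a minimal sketch of the Tesseract.js side of a tool like that, assuming the v5-style `createWorker` API; the input file name is an illustrative assumption (in a PDF.js pipeline it would be a canvas rendered from a PDF page):

```javascript
import { createWorker } from 'tesseract.js';

// Spin up an OCR worker with English language data.
const worker = await createWorker('eng');

// Recognize text from an image; recognize() also accepts canvas elements,
// which is how rendered PDF.js pages can be fed in.
const { data: { text } } = await worker.recognize('scanned-page.png');
console.log(text);

await worker.terminate();
```

The language data and WASM binary are fetched and cached by the library, so the OCR itself runs entirely in the browser.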

Prem Kumar Aparanji 👶🤖🐘

@simon wow! I didn't know about tesseract.js.

This could potentially remove the need for RPA 😄
