Hosting your own #LLM is surprisingly easy.
https://garrit.xyz/posts/2024-06-17-host-your-own-llm
@garritfra It's even easier with Mozilla's llamafile. Just download a single binary with an embedded model and run it (it also includes a web interface).
https://llamafile.ai/
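The download-and-run workflow mentioned above looks roughly like this. The model filename and download URL here are illustrative examples, not an endorsement of a specific build — pick any `.llamafile` release from the project's page:

```shell
# Fetch a llamafile release (single self-contained binary with the
# model weights embedded). Filename/URL are examples only.
curl -LO https://huggingface.co/Mozilla/llava-v1.5-7b-llamafile/resolve/main/llava-v1.5-7b-q4.llamafile

# Make the downloaded binary executable.
chmod +x llava-v1.5-7b-q4.llamafile

# Run it; this starts a local chat web UI in your browser
# (by default served on localhost).
./llava-v1.5-7b-q4.llamafile
```

No separate runtime or Python environment is needed, which is the main appeal compared with hosting a model server yourself.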