New release of LLM, my combined CLI tool and Python library for interacting with LLMs - the big new feature in 0.18 is support for async models https://llm.datasette.io/en/stable/changelog.html#v0-18
And here's llm-gemini 0.4, adding async support for the Google Gemini models plus the new Chatbot Arena-topping gemini-exp-1114 (and -o json_object 1 JSON mode for good measure) https://github.com/simonw/llm-gemini/releases/tag/0.4

llm-mistral 0.8 adds async model support for Mistral models, including the new Pixtral Large https://github.com/simonw/llm-mistral/releases/tag/0.8
And a new plugin release: llm-claude-3 0.9, adding support for asynchronous access to the Claude family of models https://github.com/simonw/llm-claude-3/releases/tag/0.9