AI company Mistral has introduced Ministral 3B and Ministral 8B, two models designed for on-device computing that emphasize efficiency and reasoning. Both support a 128k-token context length and target low-latency, privacy-focused applications.
https://alternativeto.net/news/2024/10/introducing-les-ministraux--mistral-s-new-ai-models-for-efficient-on-device-computing/