AI company Mistral has introduced the Ministral 3B and 8B models for on-device computing, emphasizing efficiency and reasoning performance. Both models support a 128k context length and target low-latency, privacy-focused applications.
alternativeto.net/news/2024/10