Mistral debuts Mistral Small 3.1, a 24B-parameter multimodal and multilingual open-source model it says outperforms Gemma 3 and GPT-4o-mini and runs on 32GB RAM
SOTA. Multimodal. Multilingual. Apache 2.0 — Hugging Face: Model Card for Mistral-Small-3.1-24B-Base-2503 — Google Cloud ...
Mistral releases Mistral Saba, a 24B-parameter custom-trained model focused on Arabic language and culture, via its API; Saba outperforms Mistral Small 3
One of the many custom-trained models to serve specific geographies, markets, and customers — X: Sophia Yang, Ph.D. / @sophiamyang: 🏟️Announcing @MistralAI Saba, our first regional language model. ...
Mistral adds web search, image generation, Canvas for editing, and more to its Le Chat chatbot, and unveils Pixtral Large, a 124B-parameter multimodal model
Pixtral grows up. Pixtral Large in short: frontier-class multimodal performance — Mistral AI: Mistral has entered the chat — Markus Kasanmascheff / WinBuzzer: Mistral Goes Multimodal to Challenge ...
Mistral AI releases Mistral Large, a cheaper GPT-4 rival that supports 32K-token context windows, and Le Chat, a ChatGPT-like chat assistant in public beta
Mistral Large is our flagship model, with top-tier reasoning capacities. — Brad Smith / Microsoft On the Issues: Microsoft's AI Access Principles: Our commitments to promote innovation and competition ...