Sarvam AI's 24B-parameter LLM for Indian languages Sarvam-M receives only 334 downloads in two days on Hugging Face, raising concerns about AI efforts in India
sarvam-m is a multilingual, hybrid-reasoning, text-only language model built on Mistral-Small. The Indian Express: Sarvam AI debuts flagship open-source LLM with 24 billion parameters ...
Meta VP of Generative AI Ahmad Al-Dahle denies a rumor that the company trained Llama 4 Maverick and Scout on test sets, saying that Meta “would never do that”
but the EU doesn't get everything.
Pascale Davies / Euronews: From a political shift to a more powerful AI: Everything to know about Meta's Llama 4 models
Jay Bonggolto / Android Central: Meta is com...
Mistral debuts Mistral Small 3.1, a 24B-parameter multimodal and multilingual open-source model it says outperforms Gemma 3 and GPT-4o-mini and runs on 32GB RAM
SOTA. Multimodal. Multilingual. Apache 2.0.
Hugging Face: Mistral AI — Model Card for Mistral-Small-3.1-24B-Base-2503
Google Cloud ...
Mistral AI releases Mistral Large, a cheaper GPT-4 rival that supports 32K-token context windows, and Le Chat, a ChatGPT-like chat assistant in public beta
Mistral Large is our flagship model, with top-tier reasoning capacities.
Brad Smith / Microsoft On the Issues: Microsoft's AI Access Principles: Our commitments to promote innovation and competition ...