2025-11-27
A huge wave is coming that will make every past technological shift look microscopic. “Massachusetts Institute of Technology on Wednesday released a study that found that artificial intelligence can already replace 11.7% of the U.S. labor market, or as much as $1.2 trillion in wages.” A quick arithmetic check of how those two figures relate is sketched below.
CNBC
An MIT study finds that AI can replace 11.7% of the US labor market, or ~$1.2T in wages, based on the “Iceberg Index”, which measures job automation potential
Massachusetts Institute of Technology on Wednesday released a study that found that artificial intelligence …
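A back-of-the-envelope check (mine, not the study's): if 11.7% of the labor market corresponds to about $1.2 trillion in wages, the implied total US wage base is roughly $10 trillion, which is in the right ballpark for total annual US wages and salaries.

```python
# Sanity check of the two headline numbers from the CNBC summary.
# The division is my own arithmetic, not a figure from the MIT study.
exposed_share = 0.117        # share of the labor market the study says AI can already replace
exposed_wages_usd = 1.2e12   # ~$1.2 trillion in wages

implied_total_wages = exposed_wages_usd / exposed_share
print(f"Implied total US wage base: ${implied_total_wages / 1e12:.1f} trillion")
# -> Implied total US wage base: $10.3 trillion
```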
2025-10-30
10,000x more energy-efficient AI soon? Extropic claims that running Denoising Thermodynamic Models (DTMs) on its Thermodynamic Sampling Units (TSUs) could make generative AI up to 10,000x more energy efficient than today's GPU-based methods, according to its simulations. A toy software illustration of the underlying probabilistic-bit idea follows below.
Wired
Extropic, which says its chips using probabilistic bits can be 10,000x more energy efficient than current AI chips, shares its first chip with some AI labs
A startup hopes to challenge Nvidia, AMD, and Intel with a chip that wrangles probabilities rather than 1s and 0s.
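The hardware story above is about chips that natively sample biased random bits. As a loose, software-only illustration of the general "probabilistic bit" idea (nothing to do with Extropic's actual TSU hardware, its DTM algorithm, or its simulations), here is a tiny Gibbs sampler over coupled binary units, where each bit is resampled from a Bernoulli distribution whose bias depends on its neighbours:

```python
import numpy as np

# Toy emulation of "probabilistic bits": a small Ising-style model where each
# bit flips according to P(+1) = sigmoid(2 * beta * local_field). Couplings and
# biases are random; this only illustrates sampling-based computation in software.
rng = np.random.default_rng(0)

n = 8                                    # number of p-bits
J = rng.normal(0, 0.5, size=(n, n))      # random couplings
J = (J + J.T) / 2                        # make symmetric
np.fill_diagonal(J, 0.0)
h = rng.normal(0, 0.5, size=n)           # per-bit biases
s = rng.integers(0, 2, size=n) * 2 - 1   # spins in {-1, +1}

def gibbs_sweep(s, J, h, beta=1.0):
    """One Gibbs sweep: resample each bit from its conditional Bernoulli distribution."""
    for i in range(len(s)):
        field = h[i] + J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        s[i] = 1 if rng.random() < p_up else -1
    return s

for _ in range(1000):
    s = gibbs_sweep(s, J, h)

print("final sample:", s)
```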
2025-10-21
Big AI progress: “DeepSeek figured out how to get 10x better compression using vision tokens than with text tokens. So you could theoretically store those 10k words in just 1,500 of their special compressed visual tokens.” The rough arithmetic behind that ratio is sketched below.
The Decoder
DeepSeek releases DeepSeek-OCR, a vision language model designed for efficient vision-text compression, enabling longer contexts with less compute
The new frontier of OCR from @deepseek_ai, exploring optical context compression for LLMs, is running blazingly fast on vLLM ⚡ (~2500 tokens/s on A100-40G) — powered by vllm==0.8....
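Rough arithmetic behind the compression claim in the quote above. The ~1.33 tokens-per-word figure is a common rule of thumb for English BPE tokenizers and is my assumption, not a number from DeepSeek; the 10k words and 1,500 visual tokens come from the quote.

```python
# Compare an estimated text-token count against the claimed visual-token budget.
words = 10_000
text_tokens = int(words * 1.33)   # rough BPE estimate: ~13,300 text tokens (assumption)
vision_tokens = 1_500             # claimed compressed visual-token budget

print(f"text tokens:   ~{text_tokens:,}")
print(f"vision tokens: ~{vision_tokens:,}")
print(f"compression ratio: ~{text_tokens / vision_tokens:.1f}x")
# -> roughly a 9x reduction, consistent with the "10x better compression" claim
```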
2025-10-16
This is pretty insane, and probably the biggest AI-related news of the year so far. Major breakthrough in AI-powered biology: Google just announced Cell2Sentence-Scale 27B (C2S-Scale), a 27-billion-parameter model built to understand the language of individual cells. Trained in … A toy illustration of the cell-to-sentence idea follows below.
The Keyword
Google releases Cell2Sentence-Scale 27B (C2S-Scale), a 27B-parameter foundation model for single-cell analysis built on its Gemma family of open models
We're launching a new 27 billion parameter foundation model for single-cell analysis built on the Gemma family of open models.
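For context on the name: the Cell2Sentence line of work represents a cell's gene-expression profile as a "sentence" of gene symbols ordered by expression rank, so a language model can read it. The sketch below is a toy version of that general recipe under my own assumptions (made-up genes and counts); it is not Google's actual pipeline.

```python
# Toy cell-to-sentence conversion: rank genes by expression and emit their names.
expression = {          # UMI counts for one fictional cell (illustrative values)
    "MALAT1": 412,
    "ACTB": 250,
    "CD3D": 97,
    "IL7R": 60,
    "GAPDH": 31,
    "FOXP3": 2,
}

def cell_to_sentence(expr: dict[str, int], top_k: int = 5) -> str:
    """Sort genes by descending expression and join the top_k names into a 'sentence'."""
    ranked = sorted(expr.items(), key=lambda kv: kv[1], reverse=True)
    return " ".join(gene for gene, count in ranked[:top_k] if count > 0)

print(cell_to_sentence(expression))
# -> "MALAT1 ACTB CD3D IL7R GAPDH"
```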
2025-10-09
This is insane. A new AI model from Samsung, 10,000x smaller than DeepSeek and Gemini 2.5 Pro, just beat them on ARC-AGI 1 and 2. Samsung's Tiny Recursive Model (TRM) is about 10,000x smaller than typical LLMs yet outperforms them on these puzzles because it thinks recursively instead of just predicting … A schematic of the recursive-refinement loop is sketched below.
VentureBeat
Samsung introduces the Tiny Recursion Model, a 7M-parameter model that can outperform LLMs 10,000x larger, like Gemini 2.5 Pro and o3-mini, on specific problems
The trend of AI researchers developing new, small open source generative models that outperform far larger …
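The reported trick is recursion: a very small network repeatedly refines a latent scratchpad and a current answer instead of emitting the answer in one forward pass. The numpy sketch below is only a schematic of that loop; the shapes, update rules, and step count are illustrative assumptions, not Samsung's actual TRM architecture.

```python
import numpy as np

# Schematic recursive refinement: update a latent scratchpad z from (question x,
# answer y, z), then update the answer y from (y, z), and repeat for n_steps.
rng = np.random.default_rng(0)
d = 16                                      # embedding width of the toy model

W_z = rng.normal(0, 0.1, size=(3 * d, d))   # latent-update weights
W_y = rng.normal(0, 0.1, size=(2 * d, d))   # answer-update weights

def refine(x, y, z, n_steps=16):
    """Recursively refine the latent scratchpad z and the current answer y."""
    for _ in range(n_steps):
        z = np.tanh(np.concatenate([x, y, z]) @ W_z)   # rethink the scratchpad
        y = np.tanh(np.concatenate([y, z]) @ W_y)      # revise the answer
    return y, z

x = rng.normal(size=d)   # embedded "question"
y = np.zeros(d)          # initial answer guess
z = np.zeros(d)          # initial latent scratchpad

y, z = refine(x, y, z)
print("refined answer embedding:", np.round(y[:4], 3), "...")
```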