2025-08-29
I'm accepting, and will be to blame for your AI coding models swearing and trying to make systems like it's 1999. [embedded post]
Anthropic
Anthropic requires users to accept new terms by September 28, including choosing whether new chats and coding sessions can be used to train AI models by default for consumer versions …
John Dolman: So the last bastion of not training on your data as a default falls. — Whilst I understand the use of user interaction to refine ...
2024-12-21
who could have thought that BIGGER COMPUTER AND MORE DATA wasn't the solution, nobody ever said tha.. [taps Searle, Dreyfus etc sign] [embedded post]
Wall Street Journal
Sources: OpenAI's GPT-5, codenamed Orion, is behind schedule and faces technical hurdles, including high computing costs and limited high-quality training data
OpenAI has run into problem after problem on its new artificial-intelligence project, code-named Orion.
2024-12-15
Honestly, much more interested in small specialized models than big all-purpose models. The latter so far have not really been that useful. [embedded post]
Nature
AI companies, running out of conventional training datasets from the web, may be forced to shift from big, all-purpose LLMs to smaller, more specialized models
Why human-sourced data can help prevent AI model collapse
Matthias Bastian / The Decoder: OpenAI co-founder says AI is reaching “peak data” as it hits the limits of the internet