Mira Murati's Thinking Machines Lab makes Tinker, its API for fine-tuning language models, generally available, adds support for Kimi K2 Thinking, and more
Tinker is a dream for multi-agent setups.

Nathan Lambert / @natolambert: Please add olmo3 @johnschulman2 et al. The goal is to make it the foundational research infrastructure for academic LLM work. Would love to have it be seamless with Tinker!

Rowan Zellers / @rown: Today we are releasing Tinker to everyone, and now with vision input! You can now finetune a frontier Qwen3-VL-235B on your own image+text data, bringing your own algorithm (SFT, RL, something else?). We'll take care of the GPU infra. Full update: https://thinkingmachines.ai/ ...

Devendra Chaplot / @dchaplot: Tinker is now open to everyone! We are also adding: - Vision support with Qwen3-VL - New model: Kimi K2 Thinking (1T params) - OpenAI API-compatible inference. Start training models within minutes: https://thinkingmachines.ai/ ...

Mira Murati / @miramurati: We're making Tinker generally available with new models and features.

@thinkymachines: Tinker is now generally available. We also added support for advanced vision input models, Kimi K2 Thinking, and a simpler way to sample from models. https://thinkingmachines.ai/ ...

LinkedIn: Devendra Chaplot: Tinker is now open to everyone! We are also adding: - Vision support with Qwen3-VL - New model: Kimi K2 Thinking (1T params) ...

Bluesky: Nate / @zzstoatzz.io: i made a little CLI to make it easy to try out the new Tinker API. You just need uv and a Tinker API key (https://tinker-console.thinkingmachines.ai/keys), then run: TINKER_API_KEY=<your-key> uvx --from git+https://github.com/zzstoatzz/hello-tinker tinker-chat
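The announcement mentions OpenAI-API-compatible inference. As a minimal sketch of what that implies, the request below uses the standard OpenAI chat-completions shape; the base URL and model id are placeholders (assumptions, not documented Tinker values), and the request is constructed but deliberately not sent:

```python
# Sketch: building a request against an OpenAI-compatible inference
# endpoint. Only the request *shape* (POST /chat/completions with a
# Bearer token and a messages array) is standard; BASE_URL and the
# model id below are illustrative placeholders, not real values.
import json
import urllib.request

BASE_URL = "https://example-tinker-endpoint/v1"  # placeholder endpoint
API_KEY = "<your-tinker-api-key>"                # from the Tinker console

payload = {
    "model": "Qwen3-VL-235B",  # hypothetical model id for illustration
    "messages": [{"role": "user", "content": "Hello from Tinker!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request; it is omitted
# here so the sketch stays runnable offline.
```

Because the endpoint speaks the OpenAI wire format, any existing OpenAI-compatible client (e.g. the official `openai` SDK with a custom `base_url`) should also work unchanged.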