@rohanpaul_ai: Thinking Machines is replacing turn-taking AI with always-present AI. They just announced TML-Interaction-Small, a 276B…
Summary
Thinking Machines announced TML-Interaction-Small, a 276B MoE model designed for real-time, always-on interaction with sub-0.4s latency and integrated multimodal processing.
Similar Articles
Interaction Models
Thinking Machines AI announces a research preview of interaction models, a new architecture designed for native, real-time human-AI collaboration across audio, video, and text. By replacing turn-based interfaces with a multi-stream, micro-turn design, the model aims to keep humans actively in the loop while delivering state-of-the-art intelligence and responsiveness.
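The announcement does not include code, but the contrast between turn-taking and a multi-stream, micro-turn design can be illustrated with a toy event loop. A minimal sketch, assuming a hypothetical event format and a placeholder reaction in place of real model inference (all names here are invented for illustration):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    stream: str  # e.g. "audio", "video", or "text"
    data: str

async def micro_turn_loop(inbox: asyncio.Queue, outbox: list, idle_ms: float = 20):
    """Hypothetical micro-turn loop: rather than waiting for the user's full
    turn to end, react to each incoming event from any stream immediately,
    emitting short 'micro-turn' outputs while input keeps arriving."""
    while True:
        event = await inbox.get()
        if event.data == "<eos>":  # end-of-session sentinel
            break
        # A real interaction model would run inference here; we append a
        # placeholder reaction per event to show the per-event cadence.
        outbox.append(f"react[{event.stream}]: {event.data}")
        await asyncio.sleep(idle_ms / 1000)  # simulate a sub-turn latency budget

async def demo():
    inbox: asyncio.Queue = asyncio.Queue()
    outbox: list = []
    for ev in [Event("audio", "hi"), Event("video", "wave"),
               Event("text", "bye"), Event("text", "<eos>")]:
        inbox.put_nowait(ev)
    await micro_turn_loop(inbox, outbox)
    return outbox

print(asyncio.run(demo()))
```

The point of the sketch is the cadence: output is produced per event rather than per completed user turn, which is what keeps the human continuously in the loop.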
@Saboo_Shubham_: This is not an Agent, just a single AI model. Thinking Machine just launched an interaction model that can simultaneous…
Thinking Machines launched a new multimodal AI model that can simultaneously listen, see, speak, interrupt, react, think, and use tools, demonstrating the convergence of models and agents.
Interaction Models from Thinking Machines Lab
Thinking Machines Lab releases a research paper introducing new interaction models for AI systems.
Micro Language Models Enable Instant Responses
Researchers introduce 8M-30M parameter micro language models that instantly generate the first few words on-device before cloud models complete responses, enabling responsive AI on ultra-constrained devices like smartwatches.
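The hybrid pattern described above can be sketched in a few lines: a tiny on-device model emits the opening words with near-zero latency while the larger cloud model finishes the response. A minimal sketch with hypothetical function names; in the real system the cloud model would condition on the micro model's prefix and stream behind it:

```python
import time

def micro_model_prefix(prompt: str) -> str:
    """Stand-in for an 8M-30M parameter on-device model: returns the first
    few words almost instantly (here, a canned opener)."""
    return "Sure, "

def cloud_model_completion(prompt: str) -> str:
    """Stand-in for the large cloud model; the sleep simulates network plus
    large-model time-to-first-token."""
    time.sleep(0.05)
    return "here is the rest of the answer."

def respond(prompt: str):
    t0 = time.time()
    prefix = micro_model_prefix(prompt)    # shown to the user immediately
    first_word_latency = time.time() - t0  # latency the user actually perceives
    rest = cloud_model_completion(prompt)  # arrives later, appended behind the prefix
    return prefix + rest, first_word_latency
```

The perceived latency is set by the micro model alone, which is why even a smartwatch-class device can feel responsive while the heavy lifting still happens in the cloud.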
tencent/HY-Embodied-0.5
Tencent releases HY-Embodied-0.5, a suite of foundation models designed for embodied AI agents featuring a Mixture-of-Transformers (MoT) architecture with efficient 2B and powerful 32B variants for real-world robot control and spatial-temporal reasoning.