@AdinaYakup: Qwen released WebWorld, an open world model series for web agents. 8B/14B/32B + Dataset, Apache 2.0, +9.9% MiniWob++, +10.9% W…
Summary
Qwen released WebWorld, an open-source world model series for web agents (8B/14B/32B, plus a dataset) under Apache 2.0, improving performance by +9.9% on MiniWob++ and +10.9% on WebArena.
Cached at: 05/11/26, 08:35 AM
Qwen released WebWorld 🌍 an open world model series for web agents
✨ 8B/14B/32B + Dataset ✨ Apache 2.0 ✨ +9.9% MiniWob++, +10.9% WebArena ✨ Matches Claude Opus 4.1 & Gemini 3 Pro on factuality, beats GPT-5 as world model ✨ Unified action space, 30+ step simulation, 5 state https://t.co/X6RL4vxIqp
Similar Articles
Qwen/Qwen3.6-27B
Qwen releases the open-weight Qwen3.6-27B model on Hugging Face, featuring improved stability, agentic coding capabilities, and thinking preservation for better developer productivity.
Qwen 3.6 Max Preview just went live on the Qwen Chat website. It currently has the highest AA-Intelligence Index score among Chinese models (52). (Will it be open source?)
Qwen 3.6 Max Preview launched on Qwen Chat website, achieving the highest AA-Intelligence Index score (52) among Chinese models, with uncertainty about whether it will be open source.
Qwen3.6-27B
Alibaba's Qwen team released Qwen3.6-27B, a new 27-billion-parameter language model, accompanied by benchmark results.
Qwen/Qwen3.6-27B-FP8
Alibaba releases Qwen3.6-27B-FP8, a 27B FP8-quantized model with strong agentic coding and reasoning benchmarks, now available on Hugging Face.
(Interactive) OpenCode Racing Game Comparison: Qwen3.6 35B vs Qwen3.5 122B vs Qwen3.5 27B vs Qwen3.5 4B vs Gemma 4 31B vs Gemma 4 26B vs Qwen3 Coder Next vs GLM 4.7 Flash
An informal benchmark comparing 8 AI models (Qwen3.6 35B, Qwen3.5 series, Gemma 4 series, GLM 4.7 Flash) in creating racing games via OpenCode/Playwright MCP, testing their coding agent capabilities and documenting various implementation quirks.