Google is ending its free web search index for site-specific searches beyond 50 domains, while Cloudflare and GoDaddy are blocking AI bots from scraping web data, potentially impacting local AI models that rely on internet access.
Cloudflare publishes its internal architecture for securely running Model Context Protocol (MCP) agents, introducing 'Code Mode' to reduce token usage by 99.9% and advocating for centralized remote server governance over local deployments.
Cloudflare details a bug in its QUIC implementation, quiche, in which a Linux kernel optimization for CUBIC congestion control caused performance issues, and describes the resulting fix.
The author introduces an open-source MCP server running on Cloudflare Workers that provides persistent, searchable memory for AI clients like Claude, ChatGPT, and Cursor using vector embeddings and duplicate detection.
A DDoS attack temporarily disrupted Canonical's public infrastructure after attackers used a stresser service explicitly marketed to bypass Cloudflare. The incident has sparked debate over Cloudflare's dual role in hosting both the attackers' commercial operations and Canonical's protected systems, raising questions about potential conflicts of interest.
The author examines the decline of RSS feeds due to scraping and interference, arguing that modern feed readers must integrate alternative syndication methods to remain relevant.
Anthropic's 'Code Mode' reframes the MCP vs CLI debate by having AI agents write code that calls tools via a runtime rather than loading full schemas into context, drastically reducing token usage. This approach combines MCP's typed contracts with lazy loading, which the author takes as evidence that the protocol is evolving rather than dying.
The article discusses a shift in AI agent tool usage from the 'MCP vs CLI' debate to 'Code Mode,' where agents write code to dynamically import tools, significantly reducing context window usage. It highlights Anthropic's approach and Cloudflare's implementation, demonstrating a 98.7% reduction in token consumption for specific tasks.
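The context savings described in the two Code Mode items above come from lazy loading: instead of serializing every tool's schema into the prompt, the agent sees a short code API and pulls in schemas only for the tools its generated code actually calls. A toy sketch of that accounting, with entirely hypothetical tool names and token counts (not Anthropic's or Cloudflare's actual implementation):

```python
# Toy illustration of the Code Mode idea: compare the context cost of
# loading every tool schema up front (classic MCP) against lazily
# importing only the tools the agent's code uses. All numbers are made up.

TOOL_SCHEMAS = {  # hypothetical MCP tool registry: schema size in tokens
    "search_docs": 1200,
    "create_ticket": 1500,
    "send_email": 900,
    "query_db": 2000,
}

def classic_context_cost() -> int:
    """Classic approach: every schema is injected into context up front."""
    return sum(TOOL_SCHEMAS.values())

def code_mode_context_cost(tools_used: list[str], api_stub: int = 50) -> int:
    """Code Mode: the agent sees a small API stub, and schemas are loaded
    only for the tools its generated code actually imports."""
    return api_stub + sum(TOOL_SCHEMAS[t] for t in tools_used)

classic = classic_context_cost()                 # 5600 tokens
lazy = code_mode_context_cost(["search_docs"])   # 1250 tokens
print(f"classic={classic}, code_mode={lazy}, saved {1 - lazy / classic:.1%}")
```

With only four toy tools the saving is modest; the headline 98.7% and 99.9% figures in the articles arise because real deployments expose hundreds of tools, of which a given task touches only a handful.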
Cloudflare reported Q1 2026 earnings that beat expectations but announced a 20% workforce reduction of 1,100 employees due to a shift toward an agentic AI-first operating model, causing shares to drop 24%.
Cloudflare announced a workforce reduction of approximately 20% as part of its strategy to build for the future.
This article outlines 10 principles for designing agent-native Command Line Interfaces (CLIs), drawing from experiences with Cloudflare and HeyGen to improve reliability for AI agents.
Kimi 2.6 has launched with a major promotional push and can now be run free on Cloudflare. Kilo Code also offers unlimited free access to MiniMax 2.7 and the Doubao engine via the Kilo Gateway.
Cloudflare and OpenAI have partnered to make OpenAI's frontier models, including GPT-5.4, directly accessible within Cloudflare Agent Cloud, enabling enterprises to deploy AI agents for real-world tasks at scale. The integration also includes Codex tools now generally available in Cloudflare Sandboxes and upcoming availability in Workers AI.