Tag: #prompt-caching

Cards List
#prompt-caching

@0xMovez: Anthropic Head of Product just dropped a 28-minute masterclass on how to put agents into production with real-world use…

X AI KOLs Timeline · 2d ago

Anthropic's Head of Product released a free 28-minute masterclass on putting AI agents into production, covering prompt caching, tool search, programmatic tool calling, compaction, and advisor strategy.

#prompt-caching

@gneubig: "The Math Behind the Cost of AI Agents" Nice, clear, tutorial by Vasco Schiavo at @OpenHandsDev on why agents can be ex…

X AI KOLs Following · 2d ago

A tutorial by Vasco Schiavo explaining the math behind the cost of AI agents, focusing on why agents can be expensive and the importance of prompt caching.
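The core arithmetic behind that claim can be sketched in a few lines (the prices and token counts below are hypothetical, not Schiavo's actual figures): when each agent turn resends the full conversation history, input cost grows roughly quadratically in the number of turns, while caching bills the repeated prefix at a steep discount.

```python
# Illustrative agent-cost model. Prices and token counts are made up
# for illustration; real provider pricing and discounts differ.
PRICE_IN = 3.00 / 1_000_000      # $/input token, uncached (assumed)
PRICE_CACHED = 0.30 / 1_000_000  # $/input token read from cache (assumed 90% discount)

def agent_input_cost(turns: int, tokens_per_turn: int, cached: bool) -> float:
    """Total input cost when every turn resends the full history so far."""
    total = 0.0
    history = 0
    for _ in range(turns):
        history += tokens_per_turn
        if cached:
            # Only the newest turn is uncached; the prior history hits the cache.
            total += tokens_per_turn * PRICE_IN
            total += (history - tokens_per_turn) * PRICE_CACHED
        else:
            # Whole history re-billed at full price every turn.
            total += history * PRICE_IN
    return total

no_cache = agent_input_cost(turns=50, tokens_per_turn=1_000, cached=False)
with_cache = agent_input_cost(turns=50, tokens_per_turn=1_000, cached=True)
print(f"no cache: ${no_cache:.2f}, cached: ${with_cache:.2f}")
```

With these assumed numbers a 50-turn agent pays roughly 7x more on input tokens without caching, which is the shape of the argument the tutorial makes.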

#prompt-caching

prompt caching, but for rl training - 7.5x speedup on long-prompt/short-response workloads

Reddit r/LocalLLaMA · 4d ago

A new optimization technique for open-source RL training engines introduces prompt caching during training, achieving up to 7.5x speedup on long-prompt, short-response workloads by reducing redundant compute.
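The headline number is easy to sanity-check with back-of-envelope arithmetic (the token counts below are hypothetical, chosen only to illustrate the long-prompt/short-response regime): if the shared prompt's prefill can be reused across rollouts, per-sample work shrinks from prompt + response to roughly the response alone.

```python
# Back-of-envelope speedup model for caching a shared prompt prefix
# across RL rollouts. Token counts are hypothetical; assume ~1 unit
# of work per token processed.
prompt_tokens = 2_600    # long shared prompt: prefilled once, then cached
response_tokens = 400    # short sampled response per rollout

work_uncached = prompt_tokens + response_tokens  # prefill + decode every rollout
work_cached = response_tokens                    # decode only; prefix KV reused

speedup = work_uncached / work_cached
print(f"ideal speedup: {speedup:.1f}x")
```

This ideal model ignores that prefill and decode have different per-token costs, so real speedups depend on batch shape and hardware, but it shows why the gain concentrates in long-prompt, short-response workloads.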

#prompt-caching

Anthropic says OpenClaw-style Claude CLI usage is allowed again

Hacker News Top · 2026-04-21

OpenClaw is a CLI tool that supports Anthropic Claude models via API key or Claude CLI reuse, with features including adaptive thinking defaults for Claude 4.6, fast mode service tier toggling, and configurable prompt caching. Anthropic has reportedly re-allowed OpenClaw-style Claude CLI usage.

#prompt-caching

Prompt Caching in the API

OpenAI Blog · 2024-10-01

OpenAI introduces Prompt Caching, an automatic feature that discounts cached input tokens by 50% and improves latency by reusing recently seen input tokens on GPT-4o, GPT-4o mini, o1-preview, and o1-mini. Caching applies automatically to prompts longer than 1,024 tokens and requires no changes to developer integrations.
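Cache hits are reported in the response's usage block, so the savings can be verified programmatically. A minimal sketch, assuming the `prompt_tokens_details.cached_tokens` field documented for the Chat Completions API; no live call is made here, and the payload below is a stub with made-up numbers:

```python
# Estimate the discount from prompt caching given a Chat Completions
# usage payload. This dict is a stub standing in for the `usage` object
# a real API response would carry.
usage = {
    "prompt_tokens": 2048,
    "completion_tokens": 120,
    "prompt_tokens_details": {"cached_tokens": 1024},
}

def cached_fraction(usage: dict) -> float:
    """Fraction of input tokens served from the prompt cache."""
    details = usage.get("prompt_tokens_details", {})
    return details.get("cached_tokens", 0) / usage["prompt_tokens"]

# Cached input tokens are billed at half price, so effective input cost
# scales by (1 - 0.5 * cached_fraction).
frac = cached_fraction(usage)
effective_input_multiplier = 1 - 0.5 * frac
print(f"{frac:.0%} cached -> pay {effective_input_multiplier:.0%} of full input price")
```

Because only exact prefix matches hit the cache, keeping stable content (system prompt, tool definitions) at the front of the prompt maximizes the cached fraction.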
