Tag: #capability-transfer
ReAD: Reinforcement-Guided Capability Distillation for Large Language Models

arXiv cs.CL

This paper introduces ReAD, a reinforcement-guided capability distillation framework that allocates distillation token budgets by accounting for cross-capability transfer in large language models. The authors report improved downstream utility and reduced harmful spillover relative to existing baselines.
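The paper's abstract does not specify how the budget allocation works, but the core idea of weighting a token budget by estimated cross-capability transfer can be sketched as follows. This is an illustrative toy, not ReAD's actual algorithm: the capability names, the softmax weighting, and the `temperature` parameter are all assumptions introduced here for clarity.

```python
import math

def allocate_token_budget(transfer_gains, total_budget, temperature=1.0):
    """Split a distillation token budget across capabilities.

    Hypothetical sketch: each capability is weighted by a softmax over
    its estimated transfer gain, so capabilities whose training data
    also improves other capabilities receive a larger token share.
    """
    exps = {k: math.exp(g / temperature) for k, g in transfer_gains.items()}
    z = sum(exps.values())
    return {k: int(round(total_budget * e / z)) for k, e in exps.items()}

# Example with made-up transfer-gain estimates: math data is assumed to
# transfer strongly to other capabilities, so it gets the largest share.
budget = allocate_token_budget(
    {"math": 1.2, "code": 0.8, "safety": 0.1}, total_budget=1_000_000
)
```

Lowering `temperature` sharpens the allocation toward the highest-transfer capability; raising it spreads tokens more evenly.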
