pretraining-finetuning


Channel-Level Semantic Perturbations: Unlearnable Examples for Diverse Training Paradigms

arXiv cs.LG · 2d ago

This paper systematically investigates unlearnable examples under diverse training paradigms, shows that pretrained weights weaken existing unlearnability methods, and proposes Shallow Semantic Camouflage (SSC), which maintains unlearnability by generating perturbations in a semantically valid subspace.
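For context, unlearnable examples are usually built with error-minimizing noise: each training sample gets a small, budget-bounded perturbation that drives its training loss toward zero, so the sample carries almost no learning signal. The sketch below illustrates that generic recipe on a toy linear model; it is not the paper's SSC method, and every name and parameter (the surrogate weights `w`, the budget `eps`, the step size) is an illustrative assumption.

```python
import numpy as np

# Toy sketch of error-minimizing ("unlearnable") perturbations against a
# fixed surrogate linear model. Hypothetical setup, not the paper's SSC:
# for each sample, find a small delta that MINIMIZES the training loss,
# so the perturbed sample contributes no useful gradient signal.

rng = np.random.default_rng(0)

# Synthetic binary classification data with labels in {-1, +1}.
n, d = 32, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)

# Fixed surrogate model the perturbation is optimized against.
w = rng.normal(size=d)

def logistic_loss(X, y, w):
    """Mean logistic loss for labels in {-1, +1}."""
    margins = y * (X @ w)
    return np.mean(np.log1p(np.exp(-margins)))

eps = 0.5        # L-infinity perturbation budget (assumed value)
lr = 0.1         # gradient step size (assumed value)
delta = np.zeros_like(X)

for _ in range(100):
    margins = y * ((X + delta) @ w)
    # Per-sample gradient of the logistic loss w.r.t. delta.
    grad = (-y / (1.0 + np.exp(margins)))[:, None] * w[None, :]
    delta -= lr * grad                   # descend: make the loss SMALLER
    delta = np.clip(delta, -eps, eps)    # project back into the budget

clean = logistic_loss(X, y, w)
poisoned = logistic_loss(X + delta, y, w)
print(f"clean loss={clean:.4f}  perturbed loss={poisoned:.4f}")
```

Because the surrogate's loss on the perturbed data is near zero, a model trained on such data tends to fit the noise rather than the semantics; the paper's observation is that initializing from pretrained weights disrupts this effect, which SSC counteracts by keeping the perturbation in a semantically valid subspace.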
