overparameterization


Deep double descent

OpenAI Blog · 2019-12-05

OpenAI research describes the 'double descent' phenomenon, in which test error first decreases, then increases, then decreases again as model size or the number of training steps grows, challenging the traditional understanding of the bias-variance tradeoff in deep learning.
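The non-monotonic curve can be reproduced in miniature without deep networks. The sketch below (my own illustration, not OpenAI's experiment) fits minimum-norm least squares on random ReLU features of increasing width; test error in such setups typically peaks near the interpolation threshold (features ≈ training samples) and falls again as the model grows past it. All names and parameter choices here are assumptions for the demo.

```python
import numpy as np

# Hypothetical demo: min-norm least squares on random ReLU features,
# sweeping model size across the interpolation threshold.
rng = np.random.default_rng(0)

n_train, n_test, d = 40, 200, 10                 # sample counts, input dim
w_true = rng.normal(size=d)                      # ground-truth linear signal
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)  # noisy labels
y_test = X_test @ w_true

def random_feature_test_mse(n_features):
    """Fit min-norm least squares on random ReLU features; return test MSE."""
    W = rng.normal(size=(d, n_features))         # fixed random projection
    phi_train = np.maximum(X_train @ W, 0.0)     # ReLU feature map
    phi_test = np.maximum(X_test @ W, 0.0)
    # lstsq returns the minimum-norm solution in the overparameterized regime
    coef, *_ = np.linalg.lstsq(phi_train, y_train, rcond=None)
    return float(np.mean((phi_test @ coef - y_test) ** 2))

# Model sizes spanning under-, near-, and over-parameterized regimes
sizes = [5, 10, 20, 40, 80, 160, 320]
errors = [random_feature_test_mse(k) for k in sizes]
for k, e in zip(sizes, errors):
    print(f"{k:4d} features -> test MSE {e:.3f}")
```

With a fixed seed this prints one test-error value per model size; the interesting comparison is between the error near 40 features (the interpolation threshold for 40 training samples) and the larger widths beyond it.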
