#stochastic-gates

Learning sparse neural networks through L₀ regularization

OpenAI Blog · 2017-12-04

OpenAI proposes a practical L₀ regularization method for neural networks that drives weights to become exactly zero during training, enabling pruning for faster inference and better generalization. Because the L₀ norm is non-differentiable, the method attaches a stochastic gate to each weight and samples it from a "hard concrete" distribution — a stretched and clipped binary concrete — whose expected L₀ penalty is differentiable, so the sparsity objective can be optimized with ordinary gradient descent.
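The gate mechanism described above can be sketched in NumPy. This is a minimal illustration, not the paper's reference implementation; the default parameters (temperature β = 2/3, stretch interval (γ, ζ) = (−0.1, 1.1)) follow the values reported in the paper, and the function names are my own:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hard_concrete_sample(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1, rng=None):
    """Sample hard concrete gates z in [0, 1].

    log_alpha is the per-weight location parameter learned by gradient
    descent. Stretching the binary concrete sample to (gamma, zeta) and
    clipping to [0, 1] lets gates take the values 0 and 1 exactly.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-6, 1 - 1e-6, size=np.shape(log_alpha))
    # Binary concrete sample in (0, 1) via the logistic reparameterization
    s = sigmoid((np.log(u) - np.log(1 - u) + log_alpha) / beta)
    # Stretch, then clip ("hard") so exact zeros and ones occur
    return np.clip(s * (zeta - gamma) + gamma, 0.0, 1.0)

def expected_l0(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1):
    """Differentiable surrogate for the L0 norm: the sum over gates of the
    probability that each gate is nonzero."""
    return np.sum(sigmoid(log_alpha - beta * np.log(-gamma / zeta)))
```

During training the weights are multiplied elementwise by sampled gates, and `expected_l0` is added to the loss with a coefficient controlling the sparsity/accuracy trade-off; gates whose probability of being nonzero collapses to zero can be pruned at test time.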
