batch-normalization


Weight normalization: A simple reparameterization to accelerate training of deep neural networks

OpenAI Blog · 2016-02-25

OpenAI presents weight normalization, a reparameterization technique that decouples the length of each weight vector from its direction. This improves the conditioning of the optimization problem and speeds up convergence at little computational cost, and because it introduces no minibatch dependencies, it is well suited to RNNs and noise-sensitive applications.
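The reparameterization the summary describes can be sketched in a few lines of NumPy. This is an illustrative example of the paper's formulation, w = (g / ||v||) · v, applied to a hypothetical dense layer's weight matrix; it is not OpenAI's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Direction parameters v (one row per output unit) and
# length parameters g (one scalar per output unit).
v = rng.normal(size=(3, 4))
g = np.ones(3)

# Weight normalization: w = (g / ||v||) * v, so g alone sets
# the length of each weight vector and v alone sets its direction.
w = (g / np.linalg.norm(v, axis=1))[:, None] * v

# Each row of w now has length exactly g, regardless of v's scale.
print(np.linalg.norm(w, axis=1))  # → [1. 1. 1.]
```

Because gradients are taken with respect to v and g separately, rescaling v leaves w unchanged, which is what decouples length from direction during training.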
