Batch Normalization

From Deep Learning Wiki

Batch normalization is a technique that normalizes the inputs to a neural network layer so that they have zero mean and unit variance. The statistics are computed over each mini-batch during training, and the normalized values are typically rescaled by learnable scale and shift parameters (gamma and beta).
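The per-mini-batch computation described above can be sketched in NumPy as follows (a minimal forward pass; the function name `batch_norm` and the small epsilon for numerical stability are illustrative choices, not a specific library API):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature mean and variance computed over the mini-batch (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize to zero mean, unit variance; eps avoids division by zero
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale (gamma) and shift (beta) restore representational power
    return gamma * x_hat + beta

# Mini-batch of 4 samples with 3 features each
x = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.],
              [10., 11., 12.]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # each feature has (near-)zero mean
print(out.std(axis=0))   # each feature has (near-)unit std
```

With `gamma=1` and `beta=0` the output is the plain normalized mini-batch; in a trained network these parameters would be learned alongside the other weights.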
