Batch normalization

From David's Wiki
Revision as of 00:00, 13 August 2020 by David

Batch normalization normalizes the activations within each mini-batch to zero mean and unit standard deviation.
The goal is to speed up the training process.

Batch norm adds two trainable parameters per feature to your network:

  • A scale (gamma).
  • A shift (beta).

It also tracks a running mean and a running standard deviation. These are statistics, not trainable parameters: during training, the mean and standard deviation are computed from the current batch and the running statistics are updated; during evaluation, the stored running statistics are used to normalize.
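This train/eval behavior can be sketched in plain NumPy for 1D features. This is a minimal illustration, not a reference implementation; names such as momentum and eps, and their default values, are assumptions:

```python
import numpy as np

class BatchNorm1d:
    """Minimal batch norm over inputs of shape (batch, features)."""

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.gamma = np.ones(num_features)    # trainable scale
        self.beta = np.zeros(num_features)    # trainable shift
        self.running_mean = np.zeros(num_features)  # running statistic (not trained)
        self.running_var = np.ones(num_features)    # running statistic (not trained)
        self.momentum = momentum
        self.eps = eps  # small constant for numerical stability

    def __call__(self, x, training=True):
        if training:
            # Use statistics of the current batch ...
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            # ... and update the running statistics with an exponential moving average.
            self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * mean
            self.running_var = (1 - self.momentum) * self.running_var + self.momentum * var
        else:
            # At evaluation time, use the stored running statistics instead.
            mean, var = self.running_mean, self.running_var
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

In training mode each output feature has approximately zero mean and unit standard deviation over the batch; in evaluation mode the output depends only on the stored statistics, so a single sample gives the same result regardless of what else is in the batch.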

Batch Norm in CNNs

See Batch norm in CNN.

While batch norm is very common in CNNs, it can lead to unexpected side effects such as brightness changes.
You should avoid using batch norm if you need to generate a video frame by frame.

In a CNN, the mean and standard deviation are calculated across the batch, width, and height of the features.

# t is the incoming tensor of shape (B, H, W, C)
# mean and stddev are computed over the (B, H, W) axes and have shape (C,)
t_mean = t.mean(dim=(0, 1, 2))
t_stddev = t.std(dim=(0, 1, 2))
# reshape to (1, 1, 1, C) so the statistics broadcast over batch and spatial axes;
# the small epsilon guards against division by zero
out = (t - t_mean.view(1, 1, 1, -1)) / (t_stddev.view(1, 1, 1, -1) + 1e-5)
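The same channel-wise computation can be checked in plain NumPy. This is a sketch assuming the (B, H, W, C) layout from the snippet above; the shapes and random data are illustrative:

```python
import numpy as np

# Fake feature maps: batch of 8, 16x16 spatial size, 3 channels, layout (B, H, W, C)
rng = np.random.default_rng(42)
t = rng.normal(loc=5.0, scale=3.0, size=(8, 16, 16, 3))

# Statistics over the batch and spatial axes -> one value per channel, shape (3,)
t_mean = t.mean(axis=(0, 1, 2))
t_stddev = t.std(axis=(0, 1, 2))

# NumPy broadcasting aligns the (3,) statistics with the trailing channel axis
out = (t - t_mean) / t_stddev

# After normalization, each channel has approximately zero mean and unit std
print(out.mean(axis=(0, 1, 2)))  # close to [0, 0, 0]
print(out.std(axis=(0, 1, 2)))   # close to [1, 1, 1]
```

Note that every (b, h, w) position of a given channel is normalized with the same pair of statistics; only the channel axis keeps separate values.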


Resources