Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

This was actually implemented that way by design. We're not discussing the reasoning here, but if you'd like to double-check the variance of the standardized mini-batch, you can use the following:

normed1.var(axis=0, unbiased=False)

Output

tensor([1.0000, 1.0000])

That’s more like it! We can also plot the histograms again to more easily visualize

the effect of batch normalization.

Figure 7.9 - After batch normalization
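The variance check above can be reproduced in a self-contained way. This is a minimal sketch: since `batch1` and `normed1` come from earlier in the chapter, a made-up mini-batch stands in for them here.

```python
import torch
import torch.nn as nn

torch.manual_seed(13)
# Hypothetical mini-batch of 16 points with two features,
# standing in for batch1[0] from the text
batch = torch.randn(16, 2) * 3 + 5

batch_normalizer = nn.BatchNorm1d(num_features=2, affine=False)
normed = batch_normalizer(batch)

# Batch norm divides by N, so the *biased* variance of the output is one
print(normed.var(axis=0, unbiased=False))  # approximately tensor([1., 1.])
# The unbiased variance (dividing by N - 1) comes out slightly above one
print(normed.var(axis=0))
```

The gap between the two calls is exactly the N/(N-1) correction factor, which is why the unbiased check looked "off" in the first place.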

Even though batch normalization achieved an output with zero mean and unit

standard deviation, the overall distribution of the output is still mostly determined

by the distribution of the inputs.

Batch normalization won’t turn it into a normal distribution.
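That claim is easy to verify: standardization is an affine transformation, so it cannot change the shape of a distribution, only its location and scale. The sketch below uses a hypothetical right-skewed mini-batch (exponential draws, chosen only for illustration) and checks that the skewness survives batch normalization.

```python
import torch
import torch.nn as nn

torch.manual_seed(42)
# Hypothetical right-skewed inputs standing in for a mini-batch
skewed = torch.distributions.Exponential(1.0).sample((256, 2))

bn = nn.BatchNorm1d(num_features=2, affine=False)
normed = bn(skewed)

def skewness(t):
    # Biased sample skewness, computed per feature
    z = (t - t.mean(axis=0)) / t.std(axis=0, unbiased=False)
    return (z ** 3).mean(axis=0)

# Zero mean and unit variance, as expected...
print(normed.mean(axis=0), normed.var(axis=0, unbiased=False))
# ...but the skewness is unchanged: still clearly positive,
# close to the exponential's theoretical value of 2
print(skewness(skewed), skewness(normed))
```

In other words, batch normalization recenters and rescales each feature, but a skewed input produces an equally skewed output.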

If we feed the second mini-batch to the batch normalizer, it will update its running

statistics accordingly:

normed2 = batch_normalizer(batch2[0])

batch_normalizer.state_dict()

540 | Chapter 7: Transfer Learning
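The running-statistics update can also be reproduced by hand. This is a sketch with made-up mini-batches standing in for `batch1[0]` and `batch2[0]`: each new batch is blended into the running mean with weight equal to the layer's `momentum` (0.1 by default), while the running variance blends in the *unbiased* batch variance.

```python
import torch
import torch.nn as nn

torch.manual_seed(13)
bn = nn.BatchNorm1d(num_features=2, affine=False)  # default momentum=0.1

# Two hypothetical mini-batches standing in for batch1[0] and batch2[0]
batch1 = torch.randn(16, 2)
batch2 = torch.randn(16, 2)
_ = bn(batch1)
_ = bn(batch2)

# Reproduce the running statistics by hand: the running mean starts at
# zero, the running variance at one
running_mean, running_var = torch.zeros(2), torch.ones(2)
for b in (batch1, batch2):
    running_mean = 0.9 * running_mean + 0.1 * b.mean(axis=0)
    running_var = 0.9 * running_var + 0.1 * b.var(axis=0, unbiased=True)

print(bn.state_dict()['running_mean'], running_mean)  # these two match
print(bn.state_dict()['running_var'], running_var)    # and so do these
```

These running statistics, not the per-batch ones, are what the layer uses to normalize inputs once it is put in evaluation mode.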
