Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


w_nn_equiv = w_nn_output.mm(w_nn_hidden1.mm(w_nn_hidden0))

In my opinion, the sequence of operations looks clearer using "@" for matrix multiplication.
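As a sketch of that equivalence, using randomly initialized stand-ins for the weight tensors (in the chapter they come from the trained model; the shapes below, two 25x25 hidden layers and a 1x25 output layer, are assumptions matching the example), both forms collapse the stack of linear layers into a single matrix:

```python
import torch

torch.manual_seed(13)
# Hypothetical stand-ins for the trained weights; double precision
# so the two orders of multiplication agree to high tolerance
w_nn_hidden0 = torch.randn(25, 25, dtype=torch.float64)
w_nn_hidden1 = torch.randn(25, 25, dtype=torch.float64)
w_nn_output = torch.randn(1, 25, dtype=torch.float64)

# Nested .mm() calls, as in the snippet above
w_equiv_mm = w_nn_output.mm(w_nn_hidden1.mm(w_nn_hidden0))
# The "@" operator reads left to right; matrix multiplication is
# associative, so the result is the same
w_equiv_at = w_nn_output @ w_nn_hidden1 @ w_nn_hidden0

print(torch.allclose(w_equiv_mm, w_equiv_at))
print(w_equiv_at.shape)
```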

Next, we need to compare them to the weights of the shallow model; that is, the logistic regression:

w_logistic_output = model_logistic.output.weight.detach()

w_logistic_output.shape

Output

torch.Size([1, 25])

Same shape, as expected. If we compare the values one by one, we'll find that they are similar, but not quite the same. Let's try to grasp the full picture by looking at a picture (yes, pun intended!).

Figure 4.9 - Comparing weights of deep-ish and shallow models

On the left, we plot all 25 weights / parameters for both models. Even though they are not quite the same, the similarity is striking. On the right, we can appreciate that the weights are, indeed, highly correlated.
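A minimal sketch of that correlation check, using hypothetical stand-ins for the two weight vectors (in the chapter, `w_nn_equiv` and `w_logistic_output` come from the trained models; the noise level below is an assumption for illustration):

```python
import torch

torch.manual_seed(42)
# Stand-ins for the two sets of 25 weights; the logistic weights are
# modeled as the equivalent weights plus a small perturbation
w_nn_equiv = torch.randn(1, 25, dtype=torch.float64)
w_logistic_output = w_nn_equiv + 0.05 * torch.randn(1, 25, dtype=torch.float64)

# Pearson correlation between the two flattened weight vectors
stacked = torch.stack([w_nn_equiv.flatten(), w_logistic_output.flatten()])
corr = torch.corrcoef(stacked)[0, 1]
print(f"correlation: {corr:.3f}")
```

A correlation close to 1.0 matches what the right-hand panel of Figure 4.9 shows: the two models learned essentially the same mapping.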

