Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

np.concatenate([dummy_points[:5].numpy(),
                dummy_sbs.predict(dummy_points)[:5]], axis=1)

Output

array([[-0.9012059 ,  0.        ],
       [ 0.56559485,  0.56559485],
       [-0.48822638,  0.        ],
       [ 0.75069577,  0.75069577],
       [ 0.58925384,  0.58925384]], dtype=float32)

No surprises here, right? Since ReLU can only return non-negative values, it will never be able to produce the points with negative values.
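This clipping behavior is easy to verify directly (a minimal sketch of my own, using input values similar to those above):

```python
import torch
import torch.nn as nn

relu = nn.ReLU()
x = torch.tensor([-0.9, 0.56, -0.48, 0.75])
# Negative inputs are clamped to zero; non-negative ones pass through
print(relu(x))  # tensor([0.0000, 0.5600, 0.0000, 0.7500])
```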

"Wait, that doesn’t look right … where is the output layer?"

OK, you caught me! I suppressed the output layer on purpose to make a point here. Please bear with me a little bit longer while I add a residual connection to the model:

class DummyResidual(nn.Module):
    def __init__(self):
        super(DummyResidual, self).__init__()
        self.linear = nn.Linear(1, 1)
        self.activation = nn.ReLU()

    def forward(self, x):
        identity = x
        out = self.linear(x)
        out = self.activation(out)
        out = out + identity  1
        return out

1 Adding the input (identity) to the output
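To see why the residual connection matters, here is a quick check of my own (the zeroed-out weights below are an illustrative choice, not part of the book's training): even if the linear layer plus ReLU contributes nothing, the skip connection still passes the input through unchanged, negatives included.

```python
import torch
import torch.nn as nn

class DummyResidual(nn.Module):
    def __init__(self):
        super(DummyResidual, self).__init__()
        self.linear = nn.Linear(1, 1)
        self.activation = nn.ReLU()

    def forward(self, x):
        identity = x
        out = self.linear(x)
        out = self.activation(out)
        out = out + identity
        return out

model = DummyResidual()
# Force the linear layer to output zero, so ReLU contributes nothing
with torch.no_grad():
    model.linear.weight.zero_()
    model.linear.bias.zero_()

x = torch.tensor([[-0.9], [0.75]])
print(model(x))  # the identity path lets negative values through
```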

Guess what happens if we replace the Dummy model with the DummyResidual model and retrain it?

548 | Chapter 7: Transfer Learning
