
Daniel Voigt Godoy, Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)


        # N, 1, H x N, H, L -> N, 1, L
        dot_products = torch.bmm(proj_query,
                                 self.proj_keys.permute(0, 2, 1))
        scores = dot_products / np.sqrt(self.d_k)
        return scores

    def forward(self, query, mask=None):
        # Query is batch-first: N, 1, H
        scores = self.score_function(query)  # N, 1, L
        if mask is not None:
            scores = scores.masked_fill(mask == 0, -1e9)
        alphas = F.softmax(scores, dim=-1)  # N, 1, L
        self.alphas = alphas.detach()

        # N, 1, L x N, L, H -> N, 1, H
        context = torch.bmm(alphas, self.values)
        return context
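To see the shape flow annotated in the comments above in isolation, here is a minimal, self-contained sketch of the same scaled dot-product attention step. The tensor names and sizes (N, L, H) are illustrative stand-ins, not the model's actual projections; the mask zeroes out the last two source positions as if they were padding.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

N, L, H = 4, 5, 8                # batch size, source length, hidden dim
query = torch.randn(N, 1, H)     # single-step query, batch-first
keys = torch.randn(N, L, H)      # stand-in for the projected keys
values = torch.randn(N, L, H)    # stand-in for the projected values

# N, 1, H x N, H, L -> N, 1, L
dot_products = torch.bmm(query, keys.permute(0, 2, 1))
scores = dot_products / (H ** 0.5)

# Pretend the last two source positions are padding and mask them out
mask = torch.ones(N, 1, L)
mask[:, :, -2:] = 0
scores = scores.masked_fill(mask == 0, -1e9)

# Softmax over the source dimension; masked positions get ~zero weight
alphas = F.softmax(scores, dim=-1)   # N, 1, L

# N, 1, L x N, L, H -> N, 1, H
context = torch.bmm(alphas, values)

print(alphas.shape, context.shape)
```

Filling the masked scores with -1e9 (rather than exactly -inf) keeps the softmax numerically safe while still driving those attention weights to effectively zero.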

Model Configuration & Training

Model Configuration

torch.manual_seed(43)
encpe = EncoderPe(n_heads=3, d_model=2, ff_units=10, n_features=2)
decpe = DecoderPe(n_heads=3, d_model=2, ff_units=10, n_features=2)
model = EncoderDecoderSelfAttn(encpe, decpe,
                               input_len=2, target_len=2)
loss = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)

Model Training

sbs_seq_selfattnpe = StepByStep(model, loss, optimizer)
sbs_seq_selfattnpe.set_loaders(train_loader, test_loader)
sbs_seq_selfattnpe.train(100)

sbs_seq_selfattnpe.losses[-1], sbs_seq_selfattnpe.val_losses[-1]
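StepByStep is the author's training helper; under the hood, a call like train(100) with MSE loss and Adam amounts to a standard PyTorch training loop. The following is a hedged stand-in sketch of that loop using a toy linear model and synthetic data (all names and tensors here are illustrative, not the book's actual model or loaders).

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(43)

# Toy data and model standing in for the book's sequence model
x = torch.randn(64, 2)
y = torch.randn(64, 2)
train_loader = DataLoader(TensorDataset(x, y), batch_size=16)

model = nn.Linear(2, 2)
loss_fn = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)

losses = []
for epoch in range(100):
    epoch_loss = 0.0
    for xb, yb in train_loader:
        optimizer.zero_grad()           # clear accumulated gradients
        loss = loss_fn(model(xb), yb)   # forward pass + loss
        loss.backward()                 # backpropagate
        optimizer.step()                # update parameters
        epoch_loss += loss.item()
    losses.append(epoch_loss / len(train_loader))

print(losses[-1])  # analogous to sbs_seq_selfattnpe.losses[-1]
```

A helper like the author's would additionally run an evaluation pass over the test loader in eval mode (with gradients disabled) each epoch to populate the validation losses.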

792 | Chapter 9 — Part II: Sequence-to-Sequence
