
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Model Configuration & Training

The model configuration is very straightforward: we create both encoder and decoder models, use them as arguments to the larger EncoderDecoder model that handles the boilerplate, and create a loss and an optimizer as usual.

Model Configuration

torch.manual_seed(23)
encoder = Encoder(n_features=2, hidden_dim=2)
decoder = Decoder(n_features=2, hidden_dim=2)
model = EncoderDecoder(encoder, decoder,
                       input_len=2, target_len=2,
                       teacher_forcing_prob=0.5)
loss = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)
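The EncoderDecoder model was built up over the preceding sections. As a refresher, the sketch below illustrates the pattern it implements: encode the source sequence into a hidden state, then decode one step at a time, occasionally feeding the ground truth back in (teacher forcing). The class names, the seq2seq_forward helper, and its signature are illustrative assumptions, not the book's exact code.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Minimal GRU encoder: reads the source sequence, returns final hidden state."""
    def __init__(self, n_features, hidden_dim):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_dim, batch_first=True)

    def forward(self, x):                  # x: (N, input_len, n_features)
        _, hidden = self.rnn(x)            # hidden: (1, N, hidden_dim)
        return hidden

class TinyDecoder(nn.Module):
    """Minimal GRU decoder: one step at a time, hidden state kept between calls."""
    def __init__(self, n_features, hidden_dim):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden_dim, batch_first=True)
        self.regression = nn.Linear(hidden_dim, n_features)
        self.hidden = None

    def forward(self, x):                  # x: (N, 1, n_features)
        out, self.hidden = self.rnn(x, self.hidden)
        return self.regression(out)        # (N, 1, n_features)

def seq2seq_forward(encoder, decoder, source, target, target_len,
                    teacher_forcing_prob=0.5):
    """Encode the source, then decode target_len steps, sometimes feeding
    the ground truth back in (teacher forcing) instead of the prediction."""
    decoder.hidden = encoder(source)
    dec_input = source[:, -1:, :]          # last source point seeds decoding
    outputs = []
    for t in range(target_len):
        pred = decoder(dec_input)
        outputs.append(pred)
        # with probability teacher_forcing_prob, feed the true point back
        if decoder.training and torch.rand(1).item() <= teacher_forcing_prob:
            dec_input = target[:, t:t + 1, :]
        else:
            dec_input = pred
    return torch.cat(outputs, dim=1)       # (N, target_len, n_features)
```

Note that teacher forcing is only applied in training mode; at evaluation time the target sequence is unknown, so the decoder must always consume its own predictions.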

Next, we use the StepByStep class to train the model:

Model Training

sbs_seq = StepByStep(model, loss, optimizer)
sbs_seq.set_loaders(train_loader, test_loader)
sbs_seq.train(100)

fig = sbs_seq.plot_losses()
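StepByStep is the training helper class developed in earlier chapters. Under the hood, a call like train(100) repeats a standard mini-batch loop once per epoch. The function below is a hedged sketch of such an epoch (its name and signature are mine, not the actual StepByStep implementation):

```python
import torch

def train_one_epoch(model, loss_fn, optimizer, train_loader, device='cpu'):
    # One epoch of mini-batch gradient descent -- the kind of loop a call
    # like sbs_seq.train(100) would repeat 100 times (illustrative sketch).
    model.train()
    batch_losses = []
    for x_batch, y_batch in train_loader:
        x_batch, y_batch = x_batch.to(device), y_batch.to(device)
        yhat = model(x_batch)              # forward pass
        loss = loss_fn(yhat, y_batch)      # e.g. nn.MSELoss()
        loss.backward()                    # compute gradients
        optimizer.step()                   # update parameters
        optimizer.zero_grad()              # reset gradients for next batch
        batch_losses.append(loss.item())
    return sum(batch_losses) / len(batch_losses)
```

The helper also runs an analogous loop over the validation loader (without gradient updates) so that plot_losses() can show both curves.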

Figure 9.7 - Losses—encoder + decoder

