
Data Preparation

# Generating training data
points, directions = generate_sequences(n=256, seed=13)
full_train = torch.as_tensor(points).float()
target_train = full_train[:, 2:]
# Generating test data
test_points, test_directions = generate_sequences(seed=19)
full_test = torch.as_tensor(test_points).float()
source_test = full_test[:, :2]
target_test = full_test[:, 2:]
# Datasets and data loaders
train_data = TensorDataset(full_train, target_train)
test_data = TensorDataset(source_test, target_test)

generator = torch.Generator()
train_loader = DataLoader(train_data, batch_size=16,
                          shuffle=True, generator=generator)
test_loader = DataLoader(test_data, batch_size=16)
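
Before moving on, it may help to sanity-check the shapes of the tensors built above. The sketch below assumes, as in the rest of this chapter, that generate_sequences() produces sequences of four corners with two coordinates each, so full_train should contain 256 sequences of shape (4, 2):

# Sanity-checking shapes (assuming four corners with two coordinates each)
print(full_train.shape)    # torch.Size([256, 4, 2])
print(target_train.shape)  # torch.Size([256, 2, 2]) - the last two corners
# Fetching one mini-batch from the training loader
source_batch, target_batch = next(iter(train_loader))
print(source_batch.shape)  # torch.Size([16, 4, 2])
print(target_batch.shape)  # torch.Size([16, 2, 2])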

fig = plot_data(points, directions, n_rows=1)

Figure 10.14 - Seq2Seq dataset

The corners show the order in which they were drawn. In the first square, the drawing started at the top-right corner and followed a clockwise direction. The source sequence for that square would include corners on the right edge (1 and 2), while the target sequence would include corners on the left edge (3 and 4), in that order.
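
To make that split concrete, here is a hypothetical example of a single clockwise square, using the same first-two / last-two slicing as in the data preparation code above (the coordinate values are made up for illustration only):

# Hypothetical corners of one square drawn clockwise from the top-right
# (illustrative values only)
seq = torch.tensor([[ 1.0,  1.0],   # 1) top-right
                    [ 1.0, -1.0],   # 2) bottom-right
                    [-1.0, -1.0],   # 3) bottom-left
                    [-1.0,  1.0]])  # 4) top-left
source_seq = seq[:2]  # corners 1 and 2 (right edge), fed to the encoder
target_seq = seq[2:]  # corners 3 and 4 (left edge), to be predicted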

Model Configuration & Training

Let’s train our Transformer! We start by creating the corresponding "layers" for both encoder and decoder, and use them both as arguments of the EncoderDecoderTransf class:
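
The configuration code itself falls on the next page. As a rough sketch of the idea, the snippet below assumes the EncoderLayer, DecoderLayer, EncoderTransf, and DecoderTransf classes built earlier in the chapter, and an EncoderDecoderTransf constructor taking the encoder, the decoder, the source and target lengths, and the number of features; the hyperparameter values are placeholders, not necessarily the ones used in the book:

# Sketch only: class signatures and hyperparameters below are assumptions
torch.manual_seed(42)
# "Layers" for encoder and decoder
enclayer = EncoderLayer(n_heads=3, d_model=6, ff_units=10)
declayer = DecoderLayer(n_heads=3, d_model=6, ff_units=10)
# Stacking the layers into an encoder and a decoder
enctransf = EncoderTransf(enclayer, n_layers=1)
dectransf = DecoderTransf(declayer, n_layers=1)
# The full encoder-decoder Transformer: source and target are two corners
# long, each corner described by two coordinates
model_transf = EncoderDecoderTransf(enctransf, dectransf,
                                    input_len=2, target_len=2, n_features=2)
loss = nn.MSELoss()
optimizer = torch.optim.Adam(model_transf.parameters(), lr=0.01)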
