
Model Configuration & Training

Let’s build a Sequential model to classify our sentences according to their source (Alice’s Adventures in Wonderland or The Wonderful Wizard of Oz) using PyTorch’s nn.EmbeddingBag:

Model Configuration

extended_embeddings = torch.as_tensor(
    extended_embeddings
).float()
boe_mean = nn.EmbeddingBag.from_pretrained(
    extended_embeddings, mode='mean'
)
torch.manual_seed(41)
model = nn.Sequential(
    # Embeddings
    boe_mean,
    # Classifier
    nn.Linear(boe_mean.embedding_dim, 128),
    nn.ReLU(),
    nn.Linear(128, 1)
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)

The model is quite straightforward: the bag-of-embeddings generates a batch of average embeddings (each sentence is represented by a single tensor of embedding_dim dimensions), and those averaged embeddings serve as features for the classifier part of the model.
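
To make the averaging concrete, here is a minimal sketch using a hypothetical five-token vocabulary (toy_embeddings is made up for illustration; it is not our actual extended_embeddings). It shows that nn.EmbeddingBag in 'mean' mode collapses each row of token indices into a single vector, the average of the corresponding embeddings:

import torch
import torch.nn as nn

# Hypothetical vocabulary: five tokens, three-dimensional embeddings
torch.manual_seed(41)
toy_embeddings = torch.randn(5, 3)

bag = nn.EmbeddingBag.from_pretrained(toy_embeddings, mode='mean')

# A batch of two "sentences", each a row of token indices
sentences = torch.tensor([[0, 2, 4],
                          [1, 3, 3]])

# Each sentence becomes ONE vector of embedding_dim dimensions
out = bag(sentences)
print(out.shape)  # torch.Size([2, 3])

# Same result as embedding every token and averaging manually
manual = toy_embeddings[sentences].mean(dim=1)
print(torch.allclose(out, manual))  # True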

We can train the model in the usual way:

Model Training

sbs_emb = StepByStep(model, loss_fn, optimizer)
sbs_emb.set_loaders(train_loader, test_loader)
sbs_emb.train(20)

fig = sbs_emb.plot_losses()
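
If you are not using the StepByStep class, the call to train(20) corresponds roughly to a standard training loop. Here is a minimal sketch, assuming (as in our setup) that the loaders yield batches of token-index tensors together with float labels of shape (batch_size, 1), which is what BCEWithLogitsLoss expects:

n_epochs = 20  # same number of epochs as sbs_emb.train(20)
for epoch in range(n_epochs):
    model.train()
    for x_batch, y_batch in train_loader:  # assumed batch format
        # Forward pass: averaged embeddings -> classifier -> logits
        logits = model(x_batch)
        loss = loss_fn(logits, y_batch)
        # Backward pass and parameter update
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()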

