Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

Output

[Sentence: "The Hatter was the first to break the silence . ` What day of the month is it ? ' he said , turning to Alice : he had taken his watch out of his pocket , and was looking at it uneasily , shaking it every now and then , and holding it to his ear ." [Tokens: 58],
 Sentence: "Alice thought this a very curious thing , and she went nearer to watch them , and just as she came up to them she heard one of them say , ` Look out now , Five ! Do n't go splashing paint over me like that !" [Tokens: 48]]
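Since the text above is already pre-tokenized (punctuation appears as separate, space-delimited tokens), the token counts can be reproduced with plain whitespace splitting; a quick sketch (no flair required for the counting itself):

```python
# The first sentence, exactly as it appears in the output above,
# with punctuation already separated by spaces
sentence = ("The Hatter was the first to break the silence . ` What "
            "day of the month is it ? ' he said , turning to Alice : he had taken "
            "his watch out of his pocket , and was looking at it uneasily , "
            "shaking it every now and then , and holding it to his ear .")

# Whitespace splitting recovers the token count reported by flair
tokens = sentence.split()
print(len(tokens))  # 58
```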

Now, each document (a Sentence object) will have its own overall embedding:

documents[0].embedding

Output

tensor([-6.4245e-02,  3.5365e-01, -2.4962e-01, -5.3912e-01,
        -1.9917e-01, -2.7712e-01,  1.6942e-01,  1.0867e-01,
        ...
         7.4661e-02, -3.4777e-01,  1.5740e-01,  3.4407e-01,
        -5.0272e-01,  1.7432e-01,  7.9398e-01,  7.3562e-01],
       device='cuda:0', grad_fn=<CatBackward>)
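The document embedding above is obtained by pooling the token-level embeddings into a single vector. A minimal sketch of the pooling idea in plain PyTorch (the random tensor and the `mean_pool` helper are made up for illustration; in the book this is handled internally by flair's document embedding classes, which default to mean pooling):

```python
import torch

def mean_pool(token_embeddings):
    # token_embeddings: tensor of shape (num_tokens, embedding_dim)
    # The document embedding is the element-wise average over tokens
    return token_embeddings.mean(dim=0)

# Hypothetical token embeddings for a 58-token sentence, 512 dims each
torch.manual_seed(42)
token_embeddings = torch.randn(58, 512)

doc_embedding = mean_pool(token_embeddings)
print(doc_embedding.shape)  # torch.Size([512])
```

Whatever the number of tokens, the pooled document embedding has a fixed size, which is what makes it usable as a feature vector for document-level tasks.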

Notice that the individual tokens don’t get their own embeddings anymore:

documents[0].tokens[31].embedding

Output

tensor([], device='cuda:0')

960 | Chapter 11: Down the Yellow Brick Rabbit Hole
