
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Output

tensor([[-0.1340, -0.0004]], grad_fn=<MulBackward0>)

Next, we compute the gated cell state using the old cell state (c) and its corresponding gate, the forget (f) gate:

f = forget_gate(initial_hidden, first_corner)

gated_cell = initial_cell * f

gated_cell

Output

tensor([[0., 0.]], grad_fn=<MulBackward0>)

Well, that’s kinda boring: since the old cell state is the initial cell state for the first data point in a sequence, gated or not, it will be a bunch of zeros.

The new, updated cell state (c') is simply the sum of the gated input and the gated cell state:

c_prime = gated_cell + gated_input

c_prime

Output

tensor([[-0.1340, -0.0004]], grad_fn=<AddBackward0>)

The only thing missing is "converting" the cell state to a new hidden state (h') using the hyperbolic tangent and the output (o) gate:

o = output_gate(initial_hidden, first_corner)

h_prime = o * torch.tanh(c_prime)

h_prime
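The snippets above rely on gate modules (forget_gate, output_gate) and tensors (initial_hidden, initial_cell, first_corner) defined earlier in the chapter. As a self-contained sketch of the same update, we can unpack the weights of a fresh nn.LSTMCell (random weights and a random input, so the numbers will differ from the ones shown above) and reproduce every step by hand, checking the result against PyTorch's own implementation:

```python
import torch
import torch.nn as nn

torch.manual_seed(19)
n_features, hidden_dim = 2, 2

# A single LSTM cell; we reproduce its update equations manually.
cell = nn.LSTMCell(n_features, hidden_dim)

# One random data point and zeroed initial states (as for the first
# point in a sequence).
x = torch.randn(1, n_features)
initial_hidden = torch.zeros(1, hidden_dim)
initial_cell = torch.zeros(1, hidden_dim)

# PyTorch stacks the four gates' weights as [input | forget | cell | output].
gates = (x @ cell.weight_ih.t() + cell.bias_ih
         + initial_hidden @ cell.weight_hh.t() + cell.bias_hh)
i, f, g, o = gates.chunk(4, dim=1)
i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
g = torch.tanh(g)

gated_input = i * g              # what the input gate lets in
gated_cell = f * initial_cell    # what the forget gate keeps (zeros here)
c_prime = gated_cell + gated_input
h_prime = o * torch.tanh(c_prime)

# Sanity check: PyTorch's cell must produce the same states.
h_ref, c_ref = cell(x, (initial_hidden, initial_cell))
print(torch.allclose(h_prime, h_ref), torch.allclose(c_prime, c_ref))
```

Since the manual computation mirrors the cell's equations exactly, both comparisons hold, confirming that the step-by-step recipe matches what nn.LSTMCell does internally.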

648 | Chapter 8: Sequences
