
In the state dictionary of a bidirectional RNN, the parameters of the reverse layer are identified by the _reverse suffix in their names (e.g., weight_ih_l0_reverse).
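For reference, here is a quick sketch of what those keys look like, assuming state holds the state dictionary of a one-layer bidirectional RNN (as it does in the code below):

list(state.keys())

Output

['weight_ih_l0', 'weight_hh_l0', 'bias_ih_l0', 'bias_hh_l0',
 'weight_ih_l0_reverse', 'weight_hh_l0_reverse',
 'bias_ih_l0_reverse', 'bias_hh_l0_reverse']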

Once again, let's create two simple RNNs and then use the weights and biases above to set their weights accordingly. Each RNN will behave as one of the layers of the bidirectional one:

rnn_forward = nn.RNN(input_size=2, hidden_size=2, batch_first=True)
rnn_reverse = nn.RNN(input_size=2, hidden_size=2, batch_first=True)

# the first four entries of the state dict belong to the forward layer
rnn_forward.load_state_dict(dict(list(state.items())[:4]))
# the last four belong to the reverse layer; k[:-8] strips the
# "_reverse" suffix (eight characters) so the keys match a regular RNN
rnn_reverse.load_state_dict(dict([(k[:-8], v)
                                  for k, v in list(state.items())[4:]]))

Output

<All keys matched successfully>

We'll be using the same single-sequence batch from before, but we also need it in reverse. We can use PyTorch's flip() to reverse the dimension corresponding to the sequence (L):

x_rev = torch.flip(x, dims=[1])  # dims=[1] is L in the (N, L, F) shape

x_rev

Output

tensor([[[-0.8670,  0.9342],
         [-0.8251, -0.9499],
         [ 0.8055, -0.9169],
         [ 1.0349,  0.9661]]])

Since there is no dependency between the two layers, we just need to feed each layer its corresponding sequence (regular and reversed) and remember to reverse back the sequence of hidden states, as sketched below.
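Here is a minimal sketch of that procedure, assuming the x, x_rev, rnn_forward, and rnn_reverse defined above (the variable names out_forward, out_reverse, out_full, and h_full are illustrative):

out_forward, h_forward = rnn_forward(x)
out_reverse, h_reverse = rnn_reverse(x_rev)
# the reverse layer's hidden states follow the reversed sequence,
# so we flip them back to align them with the original positions
out_reverse = torch.flip(out_reverse, dims=[1])
# concatenating along the last (hidden) dimension reproduces the
# (N, L, 2*H) output of a bidirectional RNN; stacking the final
# hidden states along dim 0 reproduces its (2, N, H) hidden state
out_full = torch.cat([out_forward, out_reverse], dim=-1)
h_full = torch.cat([h_forward, h_reverse], dim=0)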

