
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


linear_input = nn.Linear(n_features, hidden_dim)
linear_hidden = nn.Linear(hidden_dim, hidden_dim)

with torch.no_grad():
    linear_input.weight = nn.Parameter(rnn_state['weight_ih'])
    linear_input.bias = nn.Parameter(rnn_state['bias_ih'])
    linear_hidden.weight = nn.Parameter(rnn_state['weight_hh'])
    linear_hidden.bias = nn.Parameter(rnn_state['bias_hh'])
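For the snippet above to run on its own, the rnn_state dictionary presumably comes from the state_dict() of an nn.RNNCell. A minimal, self-contained sketch of that setup (the seed and dimensions here are illustrative assumptions, not necessarily the book's):

```python
import torch
import torch.nn as nn

torch.manual_seed(19)  # illustrative seed, not necessarily the book's
n_features, hidden_dim = 2, 2

# The RNN cell whose parameters we mirror into two linear layers
rnn_cell = nn.RNNCell(input_size=n_features, hidden_size=hidden_dim)
rnn_state = rnn_cell.state_dict()

linear_input = nn.Linear(n_features, hidden_dim)
linear_hidden = nn.Linear(hidden_dim, hidden_dim)

with torch.no_grad():
    linear_input.weight = nn.Parameter(rnn_state['weight_ih'])
    linear_input.bias = nn.Parameter(rnn_state['bias_ih'])
    linear_hidden.weight = nn.Parameter(rnn_state['weight_hh'])
    linear_hidden.bias = nn.Parameter(rnn_state['bias_hh'])

# Both layers now hold the very same parameters as the cell
print(torch.equal(linear_input.weight, rnn_cell.weight_ih))
```

This makes the decomposition explicit: the cell's input-to-hidden parameters (weight_ih, bias_ih) land in linear_input, and its hidden-to-hidden parameters (weight_hh, bias_hh) land in linear_hidden.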

Now, let’s work our way through the mechanics of the RNN cell! It all starts with the initial hidden state representing the empty sequence:

initial_hidden = torch.zeros(1, hidden_dim)

initial_hidden

Output

tensor([[0., 0.]])

Then, we use the two blue neurons, the linear_hidden layer, to transform the hidden state:

th = linear_hidden(initial_hidden)

th

Output

tensor([[-0.3565, -0.2904]], grad_fn=<AddmmBackward>)

Cool! Now, let’s take a look at a sequence of data points from our dataset:

X = torch.as_tensor(points[0]).float()

X
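From here, the remaining mechanics of the cell follow the same pattern: transform the first data point with linear_input, add it to the transformed hidden state, and apply the tanh activation. A hedged, self-contained sketch of the full update (the seed and the values of X below are illustrative stand-ins for points[0], not necessarily the book's numbers):

```python
import torch
import torch.nn as nn

torch.manual_seed(19)  # illustrative seed
n_features, hidden_dim = 2, 2

rnn_cell = nn.RNNCell(input_size=n_features, hidden_size=hidden_dim)
rnn_state = rnn_cell.state_dict()

linear_input = nn.Linear(n_features, hidden_dim)
linear_hidden = nn.Linear(hidden_dim, hidden_dim)
with torch.no_grad():
    linear_input.weight = nn.Parameter(rnn_state['weight_ih'])
    linear_input.bias = nn.Parameter(rnn_state['bias_ih'])
    linear_hidden.weight = nn.Parameter(rnn_state['weight_hh'])
    linear_hidden.bias = nn.Parameter(rnn_state['bias_hh'])

# Illustrative sequence standing in for X = points[0]
X = torch.tensor([[ 1.0,  1.0],
                  [ 1.0, -1.0],
                  [-1.0, -1.0],
                  [-1.0,  1.0]])

initial_hidden = torch.zeros(1, hidden_dim)
th = linear_hidden(initial_hidden)  # transformed hidden state
tx = linear_input(X[0:1])           # transformed first data point
new_hidden = torch.tanh(th + tx)    # add both, apply the activation

# The manual computation reproduces the cell's own output
print(torch.allclose(new_hidden, rnn_cell(X[0:1], initial_hidden)))
```

Because the two linear layers share their parameters with the cell, new_hidden matches what rnn_cell(X[0:1], initial_hidden) produces, which is exactly the point of walking through the mechanics by hand.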

Recurrent Neural Networks (RNNs) | 597
