Daniel Voigt Godoy, Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)

Every gate worthy of its name will use a sigmoid activation function to produce gate-compatible values between zero and one.
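To make the "gate-compatible values" concrete, here is a minimal NumPy sketch of the sigmoid function (the function name is ours, not the book's) showing how it squashes any real input into the open interval (0, 1):

```python
import numpy as np

def sigmoid(x):
    # squashes any real value into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

gate = sigmoid(np.array([-10.0, 0.0, 10.0]))
# values near 0 close the gate, values near 1 open it,
# and 0.5 lets roughly half the signal through
```

Multiplying a signal element-wise by such values is exactly what "gating" means: each element is scaled by a factor between fully blocked (0) and fully passed (1).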

Moreover, since all components of a GRU (n, r, and z) share a similar structure, it should be no surprise that their corresponding transformations (t_h and t_x) are also similarly computed:

Equation 8.7 - Transformations of a GRU

$$
\begin{aligned}
t_{hr} &= W_{hr} h + b_{hr} \qquad & t_{xr} &= W_{ir} x + b_{ir} \\
t_{hz} &= W_{hz} h + b_{hz} \qquad & t_{xz} &= W_{iz} x + b_{iz} \\
t_{hn} &= W_{hn} h + b_{hn} \qquad & t_{xn} &= W_{in} x + b_{in}
\end{aligned}
$$

See? They all follow the same logic! Actually, let's literally see how all these components are connected in the following diagram.
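The transformations above can also be sketched as a single GRU step in NumPy. This is a simplified illustration, not the book's code: the function name `gru_cell` and the stacked-weight layout are assumptions mirroring PyTorch's `nn.GRUCell` convention, where `W_ih` stacks (W_ir, W_iz, W_in) and `W_hh` stacks (W_hr, W_hz, W_hn):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W_ih, b_ih, W_hh, b_hh):
    # One GRU step; weight layout assumed to follow PyTorch's GRUCell,
    # with W_ih stacking (W_ir, W_iz, W_in) and W_hh stacking (W_hr, W_hz, W_hn).
    t_x = W_ih @ x + b_ih             # the three t_x transformations (r, z, n)
    t_h = W_hh @ h + b_hh             # the three t_h transformations (r, z, n)
    txr, txz, txn = np.split(t_x, 3)
    thr, thz, thn = np.split(t_h, 3)
    r = sigmoid(txr + thr)            # reset gate
    z = sigmoid(txz + thz)            # update gate
    n = np.tanh(txn + r * thn)        # candidate hidden state (r scales t_hn)
    return (1 - z) * n + z * h        # new hidden state

# tiny demo: input size 5, hidden size 4, random weights, zero initial hidden state
rng = np.random.default_rng(42)
n_in, n_hidden = 5, 4
W_ih = rng.normal(size=(3 * n_hidden, n_in))
W_hh = rng.normal(size=(3 * n_hidden, n_hidden))
b_ih = np.zeros(3 * n_hidden)
b_hh = np.zeros(3 * n_hidden)
h_new = gru_cell(rng.normal(size=n_in), np.zeros(n_hidden), W_ih, b_ih, W_hh, b_hh)
```

Notice how r, z, and n each combine one t_h and one t_x in the same way; only the activation differs (sigmoid for the gates, tanh for the candidate state), which is exactly the shared structure the text points out.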

Gated Recurrent Units (GRUs) | 627
