
"What’s so special about it?"

Even though PyTorch recursively looks for models (and layers) listed as attributes to get a comprehensive list of all parameters, it does not look for models inside Python lists. Therefore, the only way to have a list of models (or layers) is to use the appropriate nn.ModuleList, which you can still index and loop over just like any other regular list.
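A minimal sketch of the difference (the model names and layer sizes here are made up for illustration): a plain Python list hides its layers' parameters from PyTorch, while nn.ModuleList registers them.

```python
import torch.nn as nn

class PlainListModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: PyTorch does NOT register these layers,
        # so their parameters are invisible to the optimizer
        self.layers = [nn.Linear(2, 2), nn.Linear(2, 2)]

class ModuleListModel(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList registers each layer as a submodule
        self.layers = nn.ModuleList([nn.Linear(2, 2), nn.Linear(2, 2)])

    def forward(self, x):
        # a ModuleList can still be indexed and looped over
        # just like a regular list
        for layer in self.layers:
            x = layer(x)
        return x

# 0 parameter tensors vs. 4 (two weights + two biases)
print(len(list(PlainListModel().parameters())))
print(len(list(ModuleListModel().parameters())))
```

Since the plain-list model reports zero parameters, an optimizer built from model.parameters() would silently train nothing — which is exactly why nn.ModuleList matters.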

This chapter is so long that I’ve split it into two parts, so you can take a break now and let the attention mechanism sink in before moving on to its sibling, the self-attention mechanism.

TO BE CONTINUED…


