Posted On November 7, 2023
Word embedding is the first step in lots of neural networks, including Transformers (like ChatGPT) and other state-of-the-art models. Here we learn how to code a stand-alone word embedding network from scratch and with nn.Linear. We then learn how to load and use pre-trained word embedding values with nn.Embedding.
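The from-scratch/nn.Linear idea described above can be sketched like this (a minimal illustration, not the video's exact code; the toy vocabulary and 2-dimensional embedding size are assumptions for the example):

```python
import torch
import torch.nn as nn

# Hypothetical toy vocabulary -- the video's actual training data may differ.
vocab = ["troll2", "is", "great", "gymkata"]
vocab_size = len(vocab)
embedding_dim = 2  # 2 values per word, so the embeddings are easy to graph

# An embedding layer is just a Linear layer (no bias) applied to one-hot
# inputs: multiplying a one-hot vector by the weight matrix simply selects
# the weights (the embedding values) associated with that word.
embed = nn.Linear(in_features=vocab_size, out_features=embedding_dim, bias=False)

# One-hot encode a word and run it through the layer.
one_hot = torch.zeros(vocab_size)
one_hot[vocab.index("great")] = 1.0
vector = embed(one_hot)  # the 2-D embedding for "great"
print(vector.shape)  # torch.Size([2])
```

In a real run, the Linear layer's weights would be trained (for example, to predict the next word) so that similar words end up with similar embedding values.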
NOTE: This StatQuest assumes that you are already familiar with Word Embedding, if not, check out the 'Quest: https://youtu.be/viZrOnJclY0
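The pre-trained part mentioned above can be sketched with nn.Embedding like this (a minimal illustration; the embedding values here are made up, whereas the video loads values from a trained network):

```python
import torch
import torch.nn as nn

# Hypothetical pre-trained embedding values, one row per word in the
# vocabulary -- in practice these would come from a trained network.
pretrained = torch.tensor([[ 1.0,  2.0],
                           [-1.0,  0.5],
                           [ 0.3, -0.7],
                           [ 2.0,  1.0]])

# nn.Embedding is a lookup table: row i holds the embedding for token id i.
# from_pretrained() loads existing values (frozen by default).
embed = nn.Embedding.from_pretrained(pretrained)

# Look up embeddings by integer token ids instead of one-hot vectors.
token_ids = torch.tensor([0, 3])
vectors = embed(token_ids)
print(vectors.shape)  # torch.Size([2, 2])
```

Unlike the nn.Linear approach, nn.Embedding skips the one-hot encoding and indexes directly into the weight matrix, which is faster for large vocabularies.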
If you'd like to support StatQuest, please consider...
Patreon: https://www.patreon.com/statquest
...or...
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...buying my book, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
paypal: https://www.paypal.me/statquest
venmo: @JoshStarmer
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
https://twitter.com/joshuastarmer
0:00 Awesome song and introduction
1:53 Importing modules
2:48 Encoding the training data
6:55 Word Embedding from scratch
16:54 Graphing the embedding values
20:37 Word Embedding with nn.Linear
21:17 Printing out predicted words
28:12 Loading and using pre-trained Embedding values with nn.Embedding
#StatQuest #neuralnetworks #transformers