Part 1 | Training Word Embeddings | Word2Vec
- Added 27 Jul 2024
- In this video, we will learn how word embeddings are trained. The trick is to train a model on a surrogate ("fake") problem, a prediction task whose answers we do not actually care about. What we do care about are the weights the model learns along the way: after training, these weights are extracted and used as the word embeddings.
This is part 1 of 2 on training word embeddings. In this part, we cover only the theory behind training word embeddings. In the next part, we will code the same thing in Python.
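The idea above can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the code from part 2: it assumes the skip-gram variant of Word2Vec (predict a context word from a center word), a toy corpus, a window size of 1, and a full softmax output. The "fake problem" is the prediction task; once training finishes, we discard the output layer and keep the input weight matrix `W1` as the embeddings.

```python
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 5          # vocabulary size, embedding dimension

# Build (center, context) training pairs with window size 1
pairs = []
for i, w in enumerate(corpus):
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            pairs.append((word2idx[w], word2idx[corpus[j]]))

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (V, D))   # input->hidden weights (the embeddings we keep)
W2 = rng.normal(0, 0.1, (D, V))   # hidden->output weights (thrown away later)

lr = 0.05
for epoch in range(200):
    for center, context in pairs:
        h = W1[center]                      # hidden layer = embedding lookup
        scores = h @ W2                     # scores over the whole vocabulary
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()                # softmax
        grad_scores = probs.copy()          # gradient of cross-entropy loss
        grad_scores[context] -= 1.0
        W1[center] -= lr * (W2 @ grad_scores)      # SGD update: embeddings
        W2 -= lr * np.outer(h, grad_scores)        # SGD update: output layer

embeddings = W1   # each row is the learned vector for one word
print(embeddings[word2idx["fox"]])
```

After training, the prediction task is discarded entirely; only the rows of `W1` survive as word vectors. Real Word2Vec uses tricks like negative sampling instead of a full softmax, but the weight-extraction idea is the same.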
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Timestamps:
0:00 Intro
0:54 Training Word Embedding
9:18 End
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
Follow my entire playlist on Recurrent Neural Network (RNN) :
📕 RNN Playlist: • What is Recurrent Neur...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
✔ CNN Playlist: • What is CNN in deep le...
✔ Complete Neural Network: • How Neural Networks wo...
✔ Complete Logistic Regression Playlist: • Logistic Regression Ma...
✔ Complete Linear Regression Playlist: • What is Linear Regress...
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: / @codinglane
Great explanation... now I know the exact intuition behind word embeddings. Thank you for the video.
Glad it was helpful! Means a lot to me.🙂
Thank you for this easy to understand explanation
Fantastic explanation
Thank you!
great explanation
Such a great tutorial. Thank you!
Welcome!
How do you choose the number of hidden layers?
I have a question: doesn't it have memory? How does it store the training?
Can word embeddings be applied to DNA strings?
Maybe you can… you might need to study some research papers that have done the same.