BERT Neural Network - EXPLAINED!
- Published: June 4, 2024
- Understand the BERT Transformer in and out.
Follow me on M E D I U M: towardsdatascience.com/likeli...
Please subscribe to keep me alive: czcams.com/users/CodeEmporiu...
PLAYLISTS FROM MY CHANNEL
⭕ Reinforcement Learning: • Reinforcement Learning...
Natural Language Processing: • Natural Language Proce...
⭕ Transformers from Scratch: • Natural Language Proce...
⭕ ChatGPT Playlist: • ChatGPT
⭕ Convolutional Neural Networks: • Convolution Neural Net...
⭕ The Math You Should Know: • The Math You Should Know
⭕ Probability Theory for Machine Learning: • Probability Theory for...
⭕ Coding Machine Learning: • Code Machine Learning
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
REFERENCES
[1] BERT main paper: arxiv.org/pdf/1810.04805.pdf
[2] BERT in Google Search: blog.google/products/search/s...
[3] Overview of BERT: arxiv.org/pdf/2002.12327v1.pdf
[4] BERT word embeddings explained: / why-bert-has-3-embeddi...
[5] More details of BERT in this amazing blog: towardsdatascience.com/bert-e...
[6] Stanford lecture slides on BERT: nlp.stanford.edu/seminar/deta...
For details and code on building a translator using a transformer neural network, check out my playlist "Transformers from scratch": czcams.com/video/QCJQG4DuHT0/video.html
I have studied AI, NLP, and neural networks for several years. But the way you explained this was lovely, friendly, and very simple, which is why I am pretty sure you are BERT
Just watched a video on Transformers, and now this. Am astounded at how you explained such complex notions with such ease!
Hugely underrated channel!
Thanks a lot! Glad you liked it 😊
Amazing Explanation :)
big fan sir
hear hear! so agree!
To teach us, you study and explore... really grateful for your efforts, Krish.
Wherever Krish sir is, I will come there... Sir, I found you here also...
We are learning together 😇
Nice to meet you sir
I wish I had come across this channel earlier. You have a wonderful skill in explaining complicated concepts. I love your "3 pass" approach!!
Thank you for the explanation. You really have a knack for explaining NLP concepts clearly without losing much fidelity. Please keep posting!
Best explainer on YouTube; you have a good mix of simplifying so it can be understood, but not oversimplifying, so we learn deeply enough. The idea of having 3 passes going deeper was great as well.
The multiple passes of explanation is an absolutely brilliant way to explain! Thanks man.
The BEST explanation on BERT. Simply outstanding!
One of the best videos on BERT.
Great work!
Wishing you loads of success!
Wow, thanks!! I tried watching many videos and couldn't understand a single thing. But yours was truly concise and informative.
Don't hesitate, this is the best video of BERT explanation for sure!
I love the "pass1" "pass2" concept of how you explain things. It's great.
I read many blogs on BERT, but they were more focused on how to use BERT rather than what BERT actually is. This video helped me clear all my doubts regarding how BERT is trained. Clear and concise explanation.
Extremely underrated channel; didn't find any other explanation this good on YouTube/Medium/Google
Was struggling to understand the basics of BERT after going through Transformer model. This video was indeed helpful.
Great explanation, I really like the three pass idea it breaks down a lot of complications to simple concepts.
I always watch your videos and appreciate the effort you put in to make complicated topics so easy and clear. Thank you for all your work. I really like the way you explain in 3 passes... great work
Loved how you explained BERT really well. Great job!
Phenomenal the way you condense such a complicated concept into a few minutes, clearly explained.
Thanks so much for the compliments:)
Hey, your explanation and presentation of complicated concepts made TF and BERT clear to me.
I hope you will upload more exciting videos.
Probably the best (easiest to understand in one go) video on BERT. Thanks ❤️
OMG!!!! This vid is a life-saver! It just elucidated so many aspects of NLP to me (a 3-month beginner who still understands nothing)
Excellent explanation!. Will never miss a video of yours from now on!
This is one of the best resources explaining BERT available online.
I'm honestly amazed at how you managed to turn a complex algorithm into a simple 10-minute video. Much thanks to you; my final thesis appreciates you.
Hahaha anytime :)
Fantastic explanation; covered each and every point of BERT.
Looking forward to more videos on NLP.
This explanation is pretty amazing! I have a presentation on this soon. Thank you so much!
After reading lots of blogs and watching videos, I thought it was such a difficult network. But after going through this, I find BERT (and its variants available in the transformers library) so easy to understand.
Wow.. just switched from another BERT explained video to this.. stark difference.. excellent explanation indeed.. thanks..
Do you realize you are the only good description of how exactly fine tuning works I have found, and I've been researching for months. Thank you!!!
You are too kind. Thank you for the donation. You didn't have to, but it is appreciated. Also, super glad this content was useful! More of this to come.
Best ever explanation of BERT! Finally understood how it works :)
The one which I was looking for the past 6 months.! Thanks a lot for making this.
I'm late. But. Here now!
Well explained! I have been looking for something like this for quite long!
Very very friendly, clear and masterful explanation. This is exactly what I was after. Thank you!
This is amazing! Crystal clear explanation, thanks a lot.
I am very impressed by the clarity and core focus of your explanations to describe such complex processes. Thank you.
You are very welcome. Thanks for watching and commenting :)
Wonderful, and the best explanation of BERT ever :) , Good Job , and big thank you !
Every time I watch this video I gain a better understanding of the procedure. Thanks a lot for the great content!!!
Anytime! Look forward to more!
Dude this video is incredible. I cannot express how good you are at explaining
Thanks for watching! Super glad it is useful
Good try though!
Excellent articulation of the concept. Thank you.
Such an underrated channel. Keep it up, man!
great video, i like the 3-pass method you used to explain the concepts
Omg your videos are so good! So happy I found your channel, I'm binge watching everything :D
Glad you found my channel too! Thank you! Hope you enjoy them!
Excellent and concise explanation. Loved it. Thanks for this fantastic video.
Hey, your playlists are great. Also, the BERT explanation was excellent, by the way. I am thinking of working with farm-haystack using Flask, Docker, and BERT. I wonder whether it's similar to what you have explained; if not, could you share your knowledge on that with me? Thank you.
Wow I've just discovered your channel, it's full of resources, very nice!
Well explained. Short and to the point
Amazing stuff. For visualization purposes, when you get into a deeper pass, I would recommend always adding the zooming effect for intuitive understanding. I am not sure about others, but when you do that, I instantly know "OK, now we are within this 'box' "
Good thought. I'll try to make this apparent in the future. Thanks!
Thank you so much for the clear explanation, I get the grip of the BERT now!
this must be one of the best explanation videos on the internet, thank you!
You are very welcome :)
Thanks for the great explanation of Transformers and the architecture of BERT.
My pleasure and thank you for the super thanks :)
This is an excellent summary. Very clear and super well organized. Thanks very much
Thank you so much for watching ! And for the wonderful comment :$
Best explanation I have seen so far on BERT.
I thank you kindly ;)
Thanks for explaining this in such a simple way :)
3 pass explanation is a really good approach to explain this complex concept. Best video on BERT
Great video! But what is pass 1, pass 2 and pass 3?
Nice job, man! Especially the multi-phase approach of explaining things, top to bottom.
Super happy you liked the approach. Thanks for commenting
Very nice high level understanding of Transformers...
Good touch to put the references on the description instead of on the slides
Your videos are great! Amazing work
First half is exactly how much I need to understand right now, thank you :)
Awesome! You are very welcome!
Awesome! How simply you explain these models :)
Dude, you are amazing. You explained the state-of-the-art NLP model in such a clear and concise video. Thanks a ton for this video!!!!!!
You are super welcome. Thanks so much for commenting this!
Thanks a bunch for these wonderful videos!
thank you for sharing the video! very clear and helpful!!
Great Video! So easy to follow!
thank you so very much! one video was enough to get the basics clear
Glad! Welcome!
Excellent Explanation! Thank you!
wow AMAZING EXPLANATION. Thank you very much
Thank you for the reference links!
Thanks for wonderful explanation for bert architecture 🍀🌹
Best Explanation by far!
Your Videos are very well explained !
Loved the explanation, Sir!
Hey this is an exceptional training video.
Amazing Explanation! I am speechless :)
So simple and easy to understand, thanks a lot
Super glad this was useful :)
Excellent explanation. The main thing to note is the finer point around the loss functions that BERT uses, as not many other videos on this topic cover it. Too good!
Thanks so much :)
Amazing. Thanks a lot. You really **demystify** !!
Thanks a lot for this great and informative video!!
I have a question: in the slide that explains the embeddings (7:56), in the position embeddings, shouldn't E_6 actually be E_0 (i.e., the 0th position in sentence B)? Or is E_i just your notation for whatever is supposed to be there? (In which case, no question :) ) Or are the two sentences considered as one long sequence? (Or did you actually mean it's one sentence?) :)
Beautiful explanation!
Very well explained!
Simple and clear explanations (which shows you know what you're talking about). And cool graphics. Will be back for more videos :)
Thanks! Keep tuning in!
Thanks man! Great work!
I've read 4 articles before coming here. Couldn't connect the dots. This single video showed me the way.. Thanks a lottt
Super glad it helped :)
At 7:55, the position embeddings, as said in the video, encode the position of a word in a sentence. But in the slide, the sequence of position embeddings is E0..E5, E6, E7..E10 instead of E0..E5, E0, E1..E5 (implying the position embedding of a word depends on how the two sentences are arranged).
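On the position-embedding questions above: in the BERT paper's input representation, position indices do run across the whole packed sentence pair (so E6, not E0, for the first token of sentence B), and the input to the encoder is the element-wise sum of token, segment, and position embeddings. A minimal sketch, using toy random embedding tables (all sizes and token IDs here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                      # toy hidden size (BERT-base uses 768)
vocab, max_pos = 20, 16

tok_emb = rng.normal(size=(vocab, d))    # token (WordPiece) embeddings
seg_emb = rng.normal(size=(2, d))        # segment A/B embeddings
pos_emb = rng.normal(size=(max_pos, d))  # learned position embeddings

# [CLS] sent-A tokens [SEP] sent-B tokens [SEP], packed as ONE sequence
token_ids    = [1, 5, 6, 7, 2, 8, 9, 2]
segment_ids  = [0, 0, 0, 0, 0, 1, 1, 1]      # sentence A = 0, sentence B = 1
position_ids = list(range(len(token_ids)))   # 0..7 over the whole pair

# BERT's input representation: element-wise sum of the three embeddings
x = (tok_emb[token_ids]
     + seg_emb[segment_ids]
     + pos_emb[position_ids])
print(x.shape)  # one d-dimensional vector per input position
```

Note that the segment embedding, not a reset of the position index, is what tells the model which sentence a token belongs to.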
great explanation, I understood everything ! thanks a lot
Thanks so much for watching and commenting :)
Awesome work. Are you planning to put up some code snippets for BERT or GPT-3?
Thank you so much for these videos.
You are welcome! Thanks so much for commenting! Really appreciate it :)
Hey, that's a great explanation of how BERT works. Unfortunately, I still do not know how it actually learns language, i.e., how the weight matrices get updated. All you said is that we use the cross-entropy loss at the end, but is there also a formula or algorithm that uses this loss to perform gradient descent and update the weights?
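To the training question above: yes, it is ordinary backpropagation plus a gradient-based optimizer (the BERT paper uses Adam; plain SGD works the same way conceptually). A minimal sketch, not BERT itself but a single softmax layer standing in for the masked-LM head, showing how the cross-entropy loss yields a gradient that updates a weight matrix (all sizes and values here are toy assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, vocab = 8, 20
W = rng.normal(scale=0.1, size=(d, vocab))  # toy output weight matrix

h = rng.normal(size=(1, d))   # hidden state at a [MASK] position
target = 3                    # index of the true (masked-out) token
lr = 0.5                      # learning rate

def loss_of(W):
    logits = h @ W
    logits = logits - logits.max()               # numerical stability
    p = np.exp(logits) / np.exp(logits).sum()    # softmax over the vocab
    return -np.log(p[0, target]), p              # cross-entropy loss

loss_before, p = loss_of(W)

# For softmax + cross-entropy, dLoss/dlogits = p - one_hot(target);
# the chain rule then gives dLoss/dW = h^T (p - one_hot(target)).
grad_logits = p.copy()
grad_logits[0, target] -= 1.0
W = W - lr * (h.T @ grad_logits)   # one gradient-descent step

loss_after, _ = loss_of(W)
print(loss_before, "->", loss_after)
```

In real BERT the same chain rule is applied automatically through every encoder layer (autograd), and the NSP loss is added to the MLM loss before the update.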
Thank you, clear and direct explanation
@11:08 Please go on forever :):D... the best explanation I have seen so far, Thanks man!
That's it. The best explanation i came through. Receive my upvote and subscription 😁
Many thanks. Join the discord too :3
this is amazing, thank you very much!
Very good explanation, Easy to understand! Come on!
Amazing Explanation
Excellent Explanation. Thanks
great explanation.. Thanks !!
Could you please make a playlist that has ALL of your Deep learning , Computer Vision and NLP videos ?
Amazing video. Is there an explanation for the stacking aspect of the encoders? In the explanation we go over one encoder, but how many are stacked and why? Is it 12 stacked encoders for BERT-base and 24 for BERT-Large?
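On the stacking question above: yes, per the BERT paper, BERT-base stacks L=12 encoder layers and BERT-large L=24; each layer's output sequence is the next layer's input, so deeper stacks build more contextualized representations at higher compute cost. A minimal sketch of the stacking itself, with a made-up placeholder layer standing in for a real encoder block (self-attention + feed-forward):

```python
import numpy as np

class ToyEncoderLayer:
    """Placeholder for one Transformer encoder block
    (self-attention + feed-forward); here just a nonlinear map."""
    def __init__(self, d, rng):
        self.W = rng.normal(scale=1 / np.sqrt(d), size=(d, d))

    def __call__(self, x):
        return np.tanh(x @ self.W)

def build_encoder_stack(num_layers, d, seed=0):
    rng = np.random.default_rng(seed)
    return [ToyEncoderLayer(d, rng) for _ in range(num_layers)]

def encode(layers, x):
    # output of layer i is the input of layer i + 1
    for layer in layers:
        x = layer(x)
    return x

bert_base  = build_encoder_stack(12, d=8)   # BERT-base:  12 layers
bert_large = build_encoder_stack(24, d=8)   # BERT-large: 24 layers

x = np.zeros((5, 8))                        # 5 token positions, toy width 8
print(encode(bert_base, x).shape)           # shape is preserved layer to layer
```

The real models also differ in width (768 vs. 1024 hidden units) and attention heads (12 vs. 16), not just depth.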