Agree with all the comments about how clearly BERT is explained here. Still, I struggle with understanding the transformer itself. Your video and many others I've seen on transformers don't make it clear for me.
My new favorite goto channel for short but insightful high-level explanations of models in ML 😀
Simp.
@@maestbobo says the guy who's a simp for men in boxers haha
This is a massive tutorial, one of the best high-level overviews of BERT. Really nice.
Another great session! I think that getting down to the code level and seeing such models closer up is a great next step.
Thanks a lot. It was perfect and so useful with simplicity. Thanks again!
This is the best explainer video so far. Thank you so much!
Nice explanation, much better than Google's video on BERT.
The best explanation! Thanks many times
You are welcome!
Nice and crisp
Amazing explanation!
Thank you. Very good video
Well explained, ma'am, in a short video.
Best Explanation !
Nice Explanation... Thanks
You're very welcome! - Mısra
A very good channel and very good videos.
Nice video!!!
3:50 4:20 do you mean encoding or embedding? 🤔
perfect
Well done
thanks
You're welcome!
Nice, madam. When we put our own layers at the end of the BERT model, what is the purpose of those layers? Is it to determine POS, or something else? Please clear my mind about that. 🙏 Thanks
If you add layers after the BERT model, you would be doing fine-tuning. This is done to make the BERT model apply to a specific use case. -Mısra
@@AssemblyAI thanks a lot Mısra ❤️
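To make the answer above concrete, here is a minimal sketch of what "adding layers at the end of BERT" can look like for fine-tuning. It assumes PyTorch; BERT itself is stubbed out by a random tensor with the shape of its pooled output, and the head architecture (dropout + linear) is just one common, hypothetical choice for sentence classification.

```python
import torch
import torch.nn as nn

hidden_size = 768   # hidden size of BERT-base
num_labels = 2      # e.g. positive / negative sentiment (assumed task)

# The "layers at the end": a small task-specific classification head.
classifier_head = nn.Sequential(
    nn.Dropout(0.1),
    nn.Linear(hidden_size, num_labels),
)

# Stand-in for BERT's pooled [CLS] output for a batch of 4 sentences.
pooled_output = torch.randn(4, hidden_size)

# During fine-tuning, these logits would feed a cross-entropy loss,
# and gradients would update the head (and optionally BERT itself).
logits = classifier_head(pooled_output)
print(logits.shape)  # torch.Size([4, 2])
```

So the extra layers don't compute POS tags by default; they map BERT's general-purpose representations to whatever labels your specific task needs (which could be POS tags, sentiment classes, etc.).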
What could be the architecture to understand BERT?
Articulately explained, thank you.