BERT Neural Network - EXPLAINED!

  • Published 4 June 2024
  • Understand the BERT Transformer in and out.
    Follow me on M E D I U M: towardsdatascience.com/likeli...
    Please subscribe to keep me alive: czcams.com/users/CodeEmporiu...
    PLAYLISTS FROM MY CHANNEL
    ⭕ Reinforcement Learning: • Reinforcement Learning...
    ⭕ Natural Language Processing: • Natural Language Proce...
    ⭕ Transformers from Scratch: • Natural Language Proce...
    ⭕ ChatGPT Playlist: • ChatGPT
    ⭕ Convolutional Neural Networks: • Convolution Neural Net...
    ⭕ The Math You Should Know: • The Math You Should Know
    ⭕ Probability Theory for Machine Learning: • Probability Theory for...
    ⭕ Coding Machine Learning: • Code Machine Learning
    MATH COURSES (7 day free trial)
    📕 Mathematics for Machine Learning: imp.i384100.net/MathML
    📕 Calculus: imp.i384100.net/Calculus
    📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
    📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
    📕 Linear Algebra: imp.i384100.net/LinearAlgebra
    📕 Probability: imp.i384100.net/Probability
    OTHER RELATED COURSES (7 day free trial)
    📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
    📕 Python for Everybody: imp.i384100.net/python
    📕 MLOps Course: imp.i384100.net/MLOps
    📕 Natural Language Processing (NLP): imp.i384100.net/NLP
    📕 Machine Learning in Production: imp.i384100.net/MLProduction
    📕 Data Science Specialization: imp.i384100.net/DataScience
    📕 TensorFlow: imp.i384100.net/Tensorflow
    REFERENCES
    [1] BERT main paper: arxiv.org/pdf/1810.04805.pdf
    [2] BERT in Google Search: blog.google/products/search/s...
    [3] Overview of BERT: arxiv.org/pdf/2002.12327v1.pdf
    [4] BERT word embeddings explained: / why-bert-has-3-embeddi...
    [5] More details of BERT in this amazing blog: towardsdatascience.com/bert-e...
    [6] Stanford lecture slides on BERT: nlp.stanford.edu/seminar/deta...

Comments • 328

  • @CodeEmporium
    @CodeEmporium  1 year ago +6

    For details and code on building a translator using a transformer neural network, check out my playlist "Transformers from scratch": czcams.com/video/QCJQG4DuHT0/video.html

  • @brilliantdirectoriestraining

    I have studied AI, NLP and neural networks for several years, but the way you explained this was lovely, friendly and very simple, which is why I am pretty sure you are BERT.

  • @AmandeepSingh-xk4yv
    @AmandeepSingh-xk4yv 4 years ago +241

    Just watched a video on Transformers, and now this. Am astounded at how you explained such complex notions with such ease!
    Hugely underrated channel!

  • @krishnaik06
    @krishnaik06 3 years ago +202

    Amazing Explanation :)

    • @doyourealise
      @doyourealise 3 years ago +1

      big fan sir

    • @richarda1630
      @richarda1630 3 years ago

      hear hear! so agree!

    • @User-nq9ee
      @User-nq9ee 2 years ago +2

      To teach us, you study and explore... really grateful for your efforts, Krish.

    • @indgaming5452
      @indgaming5452 1 year ago

      Wherever Krish sir is, I will come there... Sir, I found you here too...
      We are learning together 😇

    • @itsme1674
      @itsme1674 1 year ago

      Nice to meet you sir

  • @jeenakk7827
    @jeenakk7827 3 years ago +24

    I wish I had come across this channel earlier. You have a wonderful skill in explaining complicated concepts. I love your "3 pass" approach!!

  • @andrewlai3358
    @andrewlai3358 4 years ago +2

    Thank you for the explanation. You really have a knack for explaining NLP concepts clearly without losing much fidelity. Please keep posting!

  • @mauriciolandos4712
    @mauriciolandos4712 1 year ago

    Best explainer on YouTube; you have a good mix of simplifying so it can be understood, but not over-simplifying, so we learn deeply enough. Having 3 passes that go deeper each time was a great idea as well.

  • @maverick3069
    @maverick3069 3 years ago +1

    The multiple passes of explanation is an absolutely brilliant way to explain! Thanks man.

  • @madhubagroy
    @madhubagroy 2 years ago +1

    The BEST explanation on BERT. Simply outstanding!

  • @ankit_khanna
    @ankit_khanna 3 years ago

    One of the best videos on BERT.
    Great work!
    Wishing you loads of success!

  • @dannysimon2965
    @dannysimon2965 4 years ago

    Wow, thanks!! I tried watching many videos and couldn't understand a single thing. But yours was truly concise and informative.

  • @healthertsy1863
    @healthertsy1863 8 months ago

    Don't hesitate, this is the best BERT explanation video for sure!

  • @NadavBenedek
    @NadavBenedek 2 years ago

    I love the "pass1" "pass2" concept of how you explain things. It's great.

  • @prasannabiswas2727
    @prasannabiswas2727 3 years ago +1

    I read many blogs on BERT, but they were more focused on how to use BERT rather than what BERT actually is. This video helped me clear all my doubts regarding how BERT is trained. Clear and concise explanation.

  • @BogdanSalyp
    @BogdanSalyp 3 years ago +1

    Extremely underrated channel, didn't find any other good explanation on CZcams/Medium/Google

  • @AbdulWahab-cy9ib
    @AbdulWahab-cy9ib 3 years ago +1

    Was struggling to understand the basics of BERT after going through the Transformer model. This video was indeed helpful.

  • @abhikc8108
    @abhikc8108 3 years ago +3

    Great explanation. I really like the three-pass idea; it breaks a lot of complications down into simple concepts.

  • @rashmikuchhal5339
    @rashmikuchhal5339 3 years ago

    I always watch your videos and appreciate the effort you put in to make complicated topics so easy and clear. Thank you for all your work. I really like the way you explain in 3 passes... great work.

  • @MarketingLeap
    @MarketingLeap 3 years ago

    Loved how you explained BERT really well. Great job!

  • @usamahussain4461
    @usamahussain4461 2 years ago +4

    Phenomenal the way you condense such a complicated concept into a few minutes, clearly explained.

  • @user-qx3jn9ii7s
    @user-qx3jn9ii7s 3 years ago

    Hey, your explanation and presentation of complicated concepts made TF and BERT clear to me.
    I look forward to more exciting videos.

  • @amreshgiri
    @amreshgiri 3 years ago

    Probably the best (easiest to understand in one go) video on BERT. Thanks ❤️

  • @ziangtian
    @ziangtian 1 year ago +1

    OMG!!!! This vid is a life-saver! It just elucidated so many aspects of NLP to me (a 3-month beginner who still understands nothing).

  • @aruncjohn
    @aruncjohn 3 years ago

    Excellent explanation! Will never miss a video of yours from now on!

  • @rohanmirchandani9726
    @rohanmirchandani9726 3 years ago

    This is one of the best resources explaining BERT available online.

  • @xJFGames
    @xJFGames 1 year ago +5

    I'm honestly amazed at how you managed to turn a complex algorithm into a simple 10-minute video. Many thanks to you; my final thesis appreciates you.

  • @Kumar08
    @Kumar08 3 years ago

    Fantastic explanation. Covered each and every point about BERT.
    Looking forward to more videos on NLP.

  • @noumaaaan
    @noumaaaan 3 years ago

    This explanation is pretty amazing! I have a presentation on this soon. Thank you so much!

  • @bidyapatip
    @bidyapatip 3 years ago

    After reading lots of blogs and videos, I thought it was such a difficult network. But after going through this, I find it so easy to understand BERT (and its variants available in the transformers library).

  • @pallavijog912
    @pallavijog912 3 years ago

    Wow.. just switched from another BERT explained video to this.. stark difference.. excellent explanation indeed.. thanks..

  • @NachoProblems
    @NachoProblems 7 months ago

    Do you realize yours is the only good description of how exactly fine-tuning works that I have found, and I've been researching for months? Thank you!!!

    • @CodeEmporium
      @CodeEmporium  7 months ago +1

      You are too kind. Thank you for the donation. You didn't have to, but it is appreciated. Also super glad this content was useful! More of this to come.

  • @rajeshve7211
    @rajeshve7211 2 days ago

    Best ever explanation of BERT! Finally understood how it works :)

  • @andymoses95
    @andymoses95 4 years ago +1

    The one I had been looking for over the past 6 months! Thanks a lot for making this.

  • @ruyanshou805
    @ruyanshou805 3 years ago

    Well explained! I have been looking for something like this for quite long!

  • @aeirya1686
    @aeirya1686 6 months ago

    Very very friendly, clear and masterful explanation. This is exactly what I was after. Thank you!

  • @sajaal-dabet148
    @sajaal-dabet148 3 years ago

    This is amazing! Crystal clear explanation, thanks a lot.

  • @ydas9125
    @ydas9125 1 year ago

    I am very impressed by the clarity and core focus of your explanations to describe such complex processes. Thank you.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      You are very welcome. Thanks for watching and commenting :)

  • @mohamedsaaou9258
    @mohamedsaaou9258 4 years ago

    Wonderful, and the best explanation of BERT ever :) Good job, and a big thank you!

  • @erfanshayegani3693
    @erfanshayegani3693 1 year ago

    Every time I watch this video I gain a better understanding of the procedure. Thanks a lot for the great content!!!

  • @michaelherrera4450
    @michaelherrera4450 3 years ago +5

    Dude this video is incredible. I cannot express how good you are at explaining

  • @arvindvasudevan45
    @arvindvasudevan45 3 years ago

    Excellent articulation of the concept. Thank you.

  • @magelauditore333
    @magelauditore333 3 years ago

    Such an underrated channel. Keep it up, man!

  • @felixakwerh5189
    @felixakwerh5189 3 years ago

    Great video, I like the 3-pass method you used to explain the concepts.

  • @LoLelfy
    @LoLelfy 3 years ago

    Omg your videos are so good! So happy I found your channel, I'm binge watching everything :D

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Glad you found my channel too! Thank you! Hope you enjoy them!

  • @pathuri86
    @pathuri86 1 year ago

    Excellent and concise explanation. Loved it. Thanks for this fantastic video.

  • @naveen6630
    @naveen6630 3 years ago

    Hey, your playlists are great, and the BERT explanation was excellent. I am thinking of working with farm-haystack using Flask, Docker, and BERT, so I wonder whether it's similar to what you have explained here; if not, could you share your knowledge on that with me? Thank you.

  • @Deddiward
    @Deddiward 3 years ago

    Wow I've just discovered your channel, it's full of resources, very nice!

  • @akashsaha3921
    @akashsaha3921 3 years ago

    Well explained. Short and to the point

  • @josephselwan1652
    @josephselwan1652 2 years ago +1

    Amazing stuff. For visualization purposes, when you get into a deeper pass, I would recommend always adding the zooming effect for intuitive understanding. I am not sure about others, but when you do that, I instantly know "OK, now we are within this 'box' "

    • @CodeEmporium
      @CodeEmporium  2 years ago +2

      Good thought. I'll try to make this apparent in the future. Thanks!

  • @seeunkim4185
    @seeunkim4185 1 year ago

    Thank you so much for the clear explanation, I've got a grip on BERT now!

  • @mays7n
    @mays7n 10 months ago

    this must be one of the best explanation videos on the internet, thank you!

  • @TSP-NPC
    @TSP-NPC 11 months ago

    Thanks for the great explanation of Transformers and the architecture of BERT.

    • @CodeEmporium
      @CodeEmporium  11 months ago

      My pleasure and thank you for the super thanks :)

  • @keyrellousadib6124
    @keyrellousadib6124 1 year ago

    This is an excellent summary. Very clear and super well organized. Thanks very much

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thank you so much for watching ! And for the wonderful comment :$

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 4 years ago

    Best explanation i have seen so far on BERT.

  • @pratyushnegi4082
    @pratyushnegi4082 2 years ago

    Thanks for explaining this in such a simple way :)

  • @JP-gk6pi
    @JP-gk6pi 3 years ago

    The 3-pass explanation is a really good approach to explaining this complex concept. Best video on BERT.

    • @beteaberra631
      @beteaberra631 2 years ago

      Great video! But what is pass 1, pass 2 and pass 3?

  • @somerset006
    @somerset006 10 months ago

    Nice job, man! Especially the multi-phase approach of explaining things, top to bottom.

    • @CodeEmporium
      @CodeEmporium  10 months ago

      Super happy you liked the approach. Thanks for commenting

  • @aitrends8901
    @aitrends8901 2 years ago

    Very nice high level understanding of Transformers...

  • @ilyasaroui7745
    @ilyasaroui7745 4 years ago

    Nice touch putting the references in the description instead of on the slides.

  • @agustindiaz3361
    @agustindiaz3361 4 years ago

    Your videos are great! Amazing work

  • @jan-rw2qx
    @jan-rw2qx 1 year ago

    The first half is exactly as much as I need to understand right now, thank you :)

  • @TheMehrdadIE
    @TheMehrdadIE 3 years ago

    Awesome! You explain these models so simply :)

  • @deepeshkumar4945
    @deepeshkumar4945 1 year ago

    Dude, you are amazing. You explained the state-of-the-art NLP model in such a clear and concise video. Thanks a ton for this video!!!!!!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are super welcome. Thanks so much for commenting this!

  • @prathyusha5393
    @prathyusha5393 4 years ago

    Thanks a bunch for these wonderful videos!

  • @urd4651
    @urd4651 2 years ago

    thank you for sharing the video! very clear and helpful!!

  • @nooshnabi9248
    @nooshnabi9248 2 years ago

    Great Video! So easy to follow!

  • @aashnavaid6918
    @aashnavaid6918 2 years ago

    thank you so very much! one video was enough to get the basics clear

  • @jx7433
    @jx7433 2 months ago

    Excellent Explanation! Thank you!

  • @20.nguyenvongockhue80
    @20.nguyenvongockhue80 2 months ago

    Wow, AMAZING EXPLANATION. Thank you very much.

  • @rizwanulhoqueratul4811

    Thank you for the reference links!

  • @maryamaziz3841
    @maryamaziz3841 3 years ago

    Thanks for the wonderful explanation of the BERT architecture 🍀🌹

  • @gauravchatterjee794
    @gauravchatterjee794 3 years ago

    Best Explanation by far!

  • @themaninjork
    @themaninjork 3 years ago

    Your Videos are very well explained !

  • @gauravsahani2499
    @gauravsahani2499 3 years ago

    Loved the explanation, Sir!

  • @snowglade1
    @snowglade1 3 years ago

    Hey this is an exceptional training video.

  • @mohsennayebi86
    @mohsennayebi86 2 years ago

    Amazing Explanation! I am speechless :)

  • @ngavu8750
    @ngavu8750 1 year ago

    So simple and easy to understand, thanks a lot

  • @goelnikhils
    @goelnikhils 1 year ago

    Excellent explanation. The main thing to note is the finer point about the loss functions that BERT uses, as not many other videos on the same topic cover this. Too good.
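
For readers wondering what those loss functions are: BERT pre-trains on the sum of two cross-entropy losses, one for predicting the masked tokens (masked language modelling) and one for the binary next-sentence-prediction decision made from the [CLS] token, as described in the BERT paper [1]. A minimal PyTorch-style sketch of that combination, with purely illustrative tensor shapes (not code from the video):

    import torch
    import torch.nn.functional as F

    # Illustrative sizes: batch of 8 sequences, 128 tokens each, 30,522-word vocabulary.
    mlm_logits = torch.randn(8, 128, 30522)    # per-token vocabulary scores from the encoder
    mlm_labels = torch.randint(0, 30522, (8, 128))
    mlm_labels[:, ::2] = -100                  # -100 marks unmasked positions; they are ignored

    nsp_logits = torch.randn(8, 2)             # IsNext / NotNext scores from the [CLS] vector
    nsp_labels = torch.randint(0, 2, (8,))

    # Masked-language-model loss: cross-entropy only over the masked positions.
    mlm_loss = F.cross_entropy(mlm_logits.view(-1, 30522), mlm_labels.view(-1), ignore_index=-100)
    # Next-sentence-prediction loss: binary classification from the [CLS] representation.
    nsp_loss = F.cross_entropy(nsp_logits, nsp_labels)

    total_loss = mlm_loss + nsp_loss           # BERT optimises the sum of the two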

  • @beingnothing34
    @beingnothing34 3 years ago

    Amazing. Thanks a lot. You really **demystify** !!

  • @karinasamvelyan6082
    @karinasamvelyan6082 3 years ago +1

    Thanks a lot for this great and informative video!!
    I have a question: in the slide that explains the embeddings (7:56), shouldn't the position embedding E_6 actually be E_0 (i.e., the 0th position in sentence B)? Or are the E_i just your notation for whatever is supposed to be there? (In which case, no question :) ) Or are the two sentences considered as one long sequence? (Or did you actually mean it's one sentence?) :)

  • @arturjaroszewicz8424
    @arturjaroszewicz8424 2 months ago

    Beautiful explanation!

  • @namanbhayani1016
    @namanbhayani1016 3 years ago

    Very well explained!

  • @harshavardhany2970
    @harshavardhany2970 3 years ago +1

    Simple and clear explanations (which shows you know what you're talking about). And cool graphics. Will be back for more videos :)

  • @jeflee2787
    @jeflee2787 3 years ago

    Thanks man! Great work!

  • @rahuldey6369
    @rahuldey6369 2 years ago

    I've read 4 articles before coming here. Couldn't connect the dots. This single video showed me the way.. Thanks a lottt

  • @sudharshantr8757
    @sudharshantr8757 2 years ago

    At 7:55, the position embeddings, as said in the video, encode the position of a word in a sentence. But in the slide, the sequence of position embeddings is E0..E5, E6, E7..E10 instead of E0..E5, E0, E1..E5 (implying that the position embedding of a word depends on how the two sentences are arranged).
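
For context from the BERT paper [1]: the position index does keep counting across the packed sentence pair (so E_0..E_10 for an 11-token pair is expected), and it is the separate segment embedding (sentence A vs. sentence B) that marks where the second sentence starts. A rough sketch of how the three embeddings are summed, using BERT-base sizes and made-up token ids (an illustration, not the video's code):

    import torch
    import torch.nn as nn

    vocab_size, max_len, hidden = 30522, 512, 768    # BERT-base sizes
    token_emb    = nn.Embedding(vocab_size, hidden)
    segment_emb  = nn.Embedding(2, hidden)           # 0 = sentence A, 1 = sentence B
    position_emb = nn.Embedding(max_len, hidden)     # learned absolute positions

    # Packed pair "[CLS] my dog is cute [SEP] he likes play ##ing [SEP]" -> 11 tokens.
    input_ids    = torch.randint(0, vocab_size, (1, 11))            # stand-in token ids
    segment_ids  = torch.tensor([[0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1]])
    position_ids = torch.arange(11).unsqueeze(0)                    # 0..10, no reset at [SEP]

    embeddings = token_emb(input_ids) + segment_emb(segment_ids) + position_emb(position_ids)
    print(embeddings.shape)    # torch.Size([1, 11, 768])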

  • @dupontremi5638
    @dupontremi5638 1 year ago

    great explanation, I understood everything ! thanks a lot

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for watching and commenting :)

  • @ErturkKadir
    @ErturkKadir 3 years ago

    Awesome work. Are you planning to put up some code snippets for BERT or GPT-3?
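
For anyone who wants to poke at a pre-trained BERT right away, one common route is the Hugging Face transformers library (a tooling suggestion, not something covered in the video). Assuming the library is installed, a minimal masked-word demo looks roughly like this:

    from transformers import pipeline

    # Downloads the pre-trained bert-base-uncased checkpoint and its tokenizer.
    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # BERT fills in [MASK] using context from both the left and the right.
    for prediction in unmasker("The man went to the [MASK] to buy some milk."):
        print(prediction["token_str"], round(prediction["score"], 3))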

  • @fahnub
    @fahnub 9 months ago

    Thank you so much for these videos.

    • @CodeEmporium
      @CodeEmporium  9 months ago +1

      You are welcome! Thanks so much for commenting! Really appreciate it :)

  • @then00bh3ro7
    @then00bh3ro7 2 years ago

    Hey, that's a great explanation of how BERT works. Unfortunately, I still do not know how it actually learns language, i.e., how the weight matrices get updated. All you said is that we use the cross-entropy loss at the end, but is there also a formula or algorithm which uses this loss to perform gradient descent and update the weights?
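
Short answer (not covered in the video): nothing BERT-specific is needed. The cross-entropy loss is pushed back through the network with ordinary backpropagation, and an optimizer (the paper uses an Adam-style optimizer) nudges every weight matrix a little on each batch. A toy PyTorch sketch of that loop, with a tiny stand-in network since the update mechanics are identical for full BERT:

    import torch
    import torch.nn as nn

    # Tiny stand-in network; the update mechanics are the same for full BERT.
    model = nn.Sequential(nn.Embedding(100, 32), nn.Flatten(), nn.Linear(32 * 8, 100))
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(3):                                # a few illustrative training steps
        tokens = torch.randint(0, 100, (4, 8))           # fake batch: 4 sequences of 8 token ids
        labels = torch.randint(0, 100, (4,))             # fake targets for the masked position
        optimizer.zero_grad()                            # clear gradients from the previous step
        loss = loss_fn(model(tokens), labels)            # forward pass -> cross-entropy loss
        loss.backward()                                  # backpropagation: d(loss)/d(every weight)
        optimizer.step()                                 # small gradient-based update of all weights
        print(step, loss.item())

Fine-tuning works the same way: start from the pre-trained weights and run the same loop with a task-specific loss on top.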

  • @renosyaputra
    @renosyaputra 3 years ago

    Thank you, clear and direct explanation

  • @dhananjaysonawane1996
    @dhananjaysonawane1996 3 years ago

    @11:08 Please go on forever :):D... the best explanation I have seen so far, Thanks man!

  • @alidi5616
    @alidi5616 2 years ago

    That's it. The best explanation I have come across. Receive my upvote and subscription 😁

  • @MrsMarsis
    @MrsMarsis 3 years ago

    this is amazing, thank you very much!

  • @caoshixing7954
    @caoshixing7954 4 years ago

    Very good explanation, Easy to understand! Come on!

  • @medomed1105
    @medomed1105 3 years ago

    Amazing Explanation

  • @srinivasanthiruvengadathan2177

    Excellent Explanation. Thanks

  • @findingx152
    @findingx152 10 months ago

    great explanation.. Thanks !!

  • @teetanrobotics5363
    @teetanrobotics5363 3 years ago

    Could you please make a playlist that has ALL of your Deep Learning, Computer Vision and NLP videos?

  • @C-Los138
    @C-Los138 2 years ago

    Amazing video. Is there an explanation for the stacking aspect of the encoders? In the explanation we go over one encoder, but how many are stacked and why? Is it 12 stacked encoders for BERT-base and 24 for BERT-Large?
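
For reference, the BERT paper [1] stacks L = 12 identical encoder layers for BERT-base (hidden size 768, 12 attention heads) and L = 24 for BERT-large (hidden size 1024, 16 heads); each layer re-attends over the previous layer's output, so deeper stacks can build progressively more abstract token representations. A rough sketch of the stacking, using PyTorch's generic encoder layer as a stand-in rather than BERT's exact implementation:

    import torch
    import torch.nn as nn

    # BERT-base hyper-parameters from the paper: L=12 layers, H=768, A=12 heads, FF=3072.
    layer = nn.TransformerEncoderLayer(d_model=768, nhead=12, dim_feedforward=3072, batch_first=True)
    encoder = nn.TransformerEncoder(layer, num_layers=12)   # clone and stack 12 layers

    x = torch.randn(1, 128, 768)    # one sequence of 128 token embeddings
    out = encoder(x)                # each layer refines every token's representation
    print(out.shape)                # torch.Size([1, 128, 768])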