
Word2vec Complete Tutorial | CBOW and Skip-gram | Game of Thrones Word2vec

  • added 19. 08. 2024
  • Code used: github.com/cam...
    colab.research...
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at: learnwith.camp...
    ============================
    📱 Grow with us:
    CampusX' LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ⌚Time Stamps⌚
    00:57 - word embeddings
    04:48 - what is word2vec?
    09:34 - how is w2v built?
    10:00 - w2v Demo
    21:04 - w2v intuition
    34:33 - Types of w2v architectures
    35:52 - CBoW
    50:28 - Skip-gram
    56:23 - Training own model - GoT data
    01:14:23 - assignment

Comments • 143

  • @subandhu39
    @subandhu39 a year ago +58

    I spent the past 2 days trying to study word2vec. I visited all the favourite YT channels, read articles on Medium, and even tried to read the paper from arXiv. This video covered and explained word2vec most beautifully. Amazing; it made me feel I actually learned word2vec with no point left out. The diagrammatic explanation of CBOW and skip-gram took it away.

  • @anoopkaur6119
    @anoopkaur6119 8 months ago +12

    I just wanted to drop a quick note to say how much I appreciate your YouTube videos. Your teaching style is fantastic, and the content is incredibly helpful. Thank you for making learning so enjoyable and accessible.

  • @utkar1
    @utkar1 2 years ago +34

    @00:57 - word embeddings
    @04:48 - what is word2vec?
    @09:34 - how is w2v built?
    @10:00 - w2v Demo
    @21:04 - w2v intuition
    @34:33 - Types of w2v architectures
    @35:52 - CBoW
    @50:28 - Skip-gram
    @56:23 - Training own model - GoT data
    @01:14:23 - assignment

    • @MELIS_I
      @MELIS_I 2 years ago +6

      World needs more people like you bro 🫡🫡🫡

    • @diwakargupta0
      @diwakargupta0 a year ago

      @campusX you can add these timestamps/chapters to your video for better reach. Thanks for the awesome content, Sir.

    • @nehapathak4932
      @nehapathak4932 a year ago

      Hi Utkarsh, I am new to this course. May I know why we get the final vector only by subtraction and addition? Is there some other way to do the same? Kindly explain.

    • @bankoleayoade8852
      @bankoleayoade8852 a year ago

      Speak English. I'm not an Indian.

  • @varunraste3538
    @varunraste3538 a year ago +6

    I don't know why, but this guy is so underrated! His content deserves to be promoted; I can't believe I'm getting to see such content for free! Thank you, Sir.

  • @lijindurairaj2982
    @lijindurairaj2982 2 years ago +5

    never found such an easy-to-understand explanation anywhere on the internet

  • @art4eigen93
    @art4eigen93 2 years ago +16

    I have never seen a video that goes through the minute details of word2vec except yours. Thanks a lot.

  • @youhavetodoit9070
    @youhavetodoit9070 a year ago +5

    I never thought that one day my knowledge of Game of Thrones would help me understand an ML concept :D By the way, great explanation 🙌🏻🙌🏻 Like always, thank you so much :)

  • @IRFANSAMS
    @IRFANSAMS 2 years ago +22

    @Campusx Thank you so much, sir. I really love your videos; please continue this playlist till BERT, ALBERT, DistilBERT and GPT as well.

    • @chandank5266
      @chandank5266 10 months ago +1

      yes, much needed... but I guess his last video on this playlist was a year ago... sir, please continue... nobody can match your teaching skills

    • @ashishparulekar8009
      @ashishparulekar8009 10 months ago

      While training, where have we chosen the architecture? CBOW or skip-gram?

  • @raj4624
    @raj4624 2 years ago +2

    Finally... after waiting for 2 weeks... it's here. Hope your holiday went well, sir. Will watch and comment later if I have any doubts.

  • @sachinkapoor2424
    @sachinkapoor2424 3 months ago +1

    Sir, watching each of your new videos it feels like there's only one heart; how many times will you win it?

  • @sachink9102
    @sachink9102 3 months ago

    OMG... man, you have 100% followed the Feynman technique of learning. Awesome!!!

  • @nidhisolanki5314
    @nidhisolanki5314 2 years ago +4

    Sir, thank you so much for your number 1, superb lectures. Please keep making such lectures. After watching your lecture, I am very satisfied and happy with a complete understanding of the concept.

  • @kmishy
    @kmishy 11 months ago +1

    underrated channel... no one is teaching like you... higher education is tough, though

  • @nayanparnami8554
    @nayanparnami8554 2 years ago +3

    A very in-depth and great explanation of word2vec... the best one by far!! Thanks for the videos 👍👍

  • @ShubhamKumar-xt3vi
    @ShubhamKumar-xt3vi 2 years ago +1

    The intuition is very well explained; now I can finally say I know word2vec.
    Thank you so much for putting in such effort.

  • @deepaknimbalkar8295
    @deepaknimbalkar8295 2 years ago +2

    Big Thank You Sir (Guru). This NLP series is very helpful for us.

  • @nomannosher8928
    @nomannosher8928 a month ago +1

    I like the video first and then watch it, because I know this will be the best explanation.

  • @utkarshtripathi9118
    @utkarshtripathi9118 2 years ago +1

    You teach very well 👍👍
    Please continue, sir

  • @subodhsingh4878
    @subodhsingh4878 a year ago +1

    The best explanation of w2v I have ever come across. I was struggling to find properly detailed info, but this laid the matter to rest. Thank you very much for making such a detailed explanation.

  • @SWAY222
    @SWAY222 8 months ago

    Immensely helpful content, very well structured and explained. Thank you for sharing this on YouTube while people are charging hefty amounts for the same. I appreciate your intention of social welfare.

  • @shankar7588
    @shankar7588 2 years ago +1

    I like the way the statements are so clearly addressed... a lot to learn. Thanks.

  • @vaibhavborkar2075
    @vaibhavborkar2075 a year ago +1

    Sir, you are a really amazing person... Your way of explaining is great and easily understandable... one of the best channels, CampusX ☺

  • @gnial2001
    @gnial2001 27 days ago

    thank you. Watching in 2024, but it's very informative, and your teaching style is very good.

  • @somdubey5436
    @somdubey5436 6 months ago

    Very well explained. It's a blessing to see content from such a great and passionate teacher, that too free of cost... I don't think even paid courses can explain things with such great clarity.

    • @bebrave486.
      @bebrave486. 6 months ago

      Hi,
      I just wanted to know: have you tried to run the code? While downloading that 1.5 GB file I'm getting an error. Please let me know if you were able to download it.

  • @amansahu9898
    @amansahu9898 4 months ago

    I really appreciate your work; I love watching your videos. I found a small mistake at 54:15: it should be Skip-gram for a small dataset and CBOW for a large dataset.
    Thanks

  • @vybhavish
    @vybhavish 9 months ago

    I can say your way of explaining concepts is really good and easy to understand. Your content is the best among most YouTube channels. Hope your videos reach a larger audience in the future 👍

  • @hemangdhanani9434
    @hemangdhanani9434 2 years ago +1

    Amazing explanation, bro, superb. Clear, complete concepts about word2vec; thanks for making such great content.

  • @fantasymarketer
    @fantasymarketer 4 months ago

    In a simple word, this tutorial is easy, easy, and easy....♥♥

  • @user-py6rr9bg9o
    @user-py6rr9bg9o 8 months ago

    Very nicely explained. Many concepts got cleared in a single video.

  • @amitkumar2005
    @amitkumar2005 4 months ago

    Nitish, your explanations are mind-boggling! Superb!!

  • @rituarya1004
    @rituarya1004 a year ago

    I don't have words for this brilliant explanation

  • @viratmani7011
    @viratmani7011 2 years ago +2

    Waiting for this lecture 😍😍❤

  • @saimen4020
    @saimen4020 2 years ago +1

    Game of Thrones is my favourite series; I'm 13.

  • @alishaw9999
    @alishaw9999 2 years ago +1

    Amazing, you wonderfully explained all aspects of the topic

  • @navinpandey1309
    @navinpandey1309 2 years ago +1

    That was fun. You are amazing...

  • @anishkhatiwada2502
    @anishkhatiwada2502 5 months ago

    Nice explanation; I've got clarity on word2vec and the other techniques now.

  • @pramodshaw2997
    @pramodshaw2997 2 years ago +1

    very informative

  • @krishnabhadke6161
    @krishnabhadke6161 a month ago

    Perfectly explained

  • @ravinarang6865
    @ravinarang6865 a year ago

    I am glad I found you on YouTube :) Lots of power to your work!!

  • @HamzaShahzadEB_
    @HamzaShahzadEB_ 8 months ago

    This series is extremely helpful!

  • @zkhan2023
    @zkhan2023 2 years ago +1

    THANK YOU SIR JEE

  • @rambaldotra7259
    @rambaldotra7259 7 months ago

    Rarest knowledge ❤

  • @riyatiwari4767
    @riyatiwari4767 8 months ago

    Please continue this course.
    Please cover topics like named entity recognition, topic modelling, sentiment analysis, etc.

  • @surjeetbasu797
    @surjeetbasu797 8 months ago

    very well explained Sir!

  • @shahmuhammadraditrahaman9904

    Best lecture ❤️

  • @ppsheth91
    @ppsheth91 2 years ago

    Thanks, Nitish, for the detailed explanation!😀

  • @utkar1
    @utkar1 2 years ago +5

    Thanks, Nitish, for this; loving the series!! 2 queries:
    1. In the GOT example, did we use skip-gram or CBOW?
    2. While selecting CBOW vs skip-gram, how can we decide how much data counts as large/small?

  • @pavangoyal6840
    @pavangoyal6840 9 months ago

    Excellent..! Please continue this playlist

  • @joyeetamallik5063
    @joyeetamallik5063 2 years ago +2

    Sir, please think about taking some online NLP classes where we can complete the course on a fast track. These videos are very useful.

  • @atulnagane9072
    @atulnagane9072 2 years ago +1

    nicely explained, sir

  • @whothefisyash
    @whothefisyash a month ago

    53:45 small correction: skip-gram with smaller data and CBOW with larger data

  • @asifpervezpolok2243
    @asifpervezpolok2243 2 years ago +1

    Great 😍

  • @monishnigam5857
    @monishnigam5857 a year ago

    You have made my day. 🤩

  • @user-hi4rl3rw4v
    @user-hi4rl3rw4v 5 months ago

    brilliant explanation

  • @jatinverma5503
    @jatinverma5503 7 months ago +1

    Can you please confirm the writing pad that you use for the videos? Thanks in advance.

  • @ali75988
    @ali75988 9 months ago

    O Beeeeeeeeeeen Stokes... that was great fun.

  • @imamasafeer8772
    @imamasafeer8772 5 months ago

    Sir, please make a complete playlist on Computer Vision as well... eagerly waiting.

  • @utkarshtripathi9118
    @utkarshtripathi9118 2 years ago

    thank u so much for this video
    🙏🙏👌👌😊😊

  • @ADITYASHARMA-mv1yd
    @ADITYASHARMA-mv1yd 11 months ago +1

    Sir, please continue this playlist. Please make videos on transformers, BERT, etc. 😵

  • @mihirnaik3383
    @mihirnaik3383 2 years ago +8

    Stopped following Krish Naik after I started watching your videos 😄 Great content!
    Referring many friends to your playlist 💪🏽
    AttributeError: 'Word2VecKeyedVectors' object has no attribute 'get_normed_vectors'
    Getting this error while executing "model.wv.get_normed_vectors()"; any suggestions, brother?

    • @mihirnaik3383
      @mihirnaik3383 2 years ago +1

      Problem Solved:
      my_dict = {}
      for key in model.wv.vocab:
          my_dict[key] = model.wv[key]
      X = list(my_dict.values())
      y = list(model.wv.vocab.keys())
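For reference, the AttributeError above looks like a gensim version mismatch: `get_normed_vectors()` was added in gensim 4.x, while the `model.wv.vocab` dict used in the workaround exists only in gensim 3.x. A version-independent sketch of what the call computes (unit-length embedding rows; `vecs` below is a tiny hypothetical stand-in for `model.wv.vectors`):

```python
import numpy as np

# What model.wv.get_normed_vectors() returns in gensim 4.x is just the
# embedding matrix with each row rescaled to unit L2 norm. If the method
# is missing (gensim 3.x), you can normalize manually like this.
def normed_vectors(vecs):
    norms = np.linalg.norm(vecs, axis=1, keepdims=True)
    return vecs / np.where(norms == 0.0, 1.0, norms)  # guard all-zero rows

vecs = np.array([[3.0, 4.0], [0.0, 2.0]])  # stand-in for model.wv.vectors
unit = normed_vectors(vecs)
# each row now has length 1: [[0.6, 0.8], [0.0, 1.0]]
```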

    • @joyeetamallik5063
      @joyeetamallik5063 2 years ago +2

      True

    • @ronylpatil
      @ronylpatil 2 years ago +2

      Same with me bro.

    • @sidindian1982
      @sidindian1982 a year ago +2

      Codebasics + Krish Naik are doing a CCP job, nothing else... just hype, only sound, no proper content or flow 😏😏😏

  • @llllllllllllllllllllIll
    @llllllllllllllllllllIll 2 years ago +2

    First of all, thanks for all these amazing videos; they really help so much. I just had one question: is this NLP playlist enough? I mean, does it cover all of NLP, or is there anything more I should explore by myself? Please, sir, clear this one doubt; it would be a big help.

    • @campusx-official
      @campusx-official  2 years ago +2

      czcams.com/video/PKv_okm1H-k/video.html

    • @llllllllllllllllllllIll
      @llllllllllllllllllllIll 2 years ago +1

      @@campusx-official thanks a lot, sir; I completed the entire playlist and am now watching this roadmap 😅

    • @sidindian1982
      @sidindian1982 a year ago

      @@campusx-official Sir, data execution on the Jupyter notebook is taking more time 😞

  • @ashishsom3849
    @ashishsom3849 2 months ago

    Thank you so much!

  • @miteshmungse5251
    @miteshmungse5251 2 years ago +3

    Because, sir, we are learning from you here, but for interviews, especially on how to explain a project (particularly ML ones), if you make a video on that it will be a big help, sir.

  • @NavinKrGoyal
    @NavinKrGoyal 2 years ago +2

    hello Nitish sir
    I am a fan of your machine learning series and I continue watching. my question is
    (1) model.wv.index_to_key and
    (2) model.wv.get_normed_vectors()
    are not working

  • @devashrichaudhari6875
    @devashrichaudhari6875 2 years ago +1

    According to Wikipedia, CBOW is much faster than skip-gram, so how can we use skip-gram for a large dataset?

  • @yasharya8228
    @yasharya8228 2 years ago +1

    sir please make a video on object detection model

  • @tusarmundhra5560
    @tusarmundhra5560 9 months ago

    awesome

  • @vigneshls8590
    @vigneshls8590 2 years ago +1

    Sir
    1. Why have we used sent_tokenize? Can't we just tokenize individual words directly?
    2. Which way was the model created: CBOW or skip-gram?
    3. Could you add links to the deep learning lectures you kept referring to in the video?
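On question 2: gensim's Word2Vec selects the architecture through its sg parameter (sg=0, the default, is CBOW; sg=1 is skip-gram), so unless sg=1 was passed in the notebook, CBOW was trained. A plain-Python sketch (with a hypothetical helper name, not gensim API) of how the two architectures frame the same window differently:

```python
# For each position, CBOW predicts the centre word from its context words,
# while skip-gram predicts each context word from the centre word.
def training_pairs(tokens, window=2):
    cbow, skipgram = [], []
    for i, centre in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        cbow.append((context, centre))                 # context -> centre
        skipgram.extend((centre, c) for c in context)  # centre -> each context word
    return cbow, skipgram

cbow, sg = training_pairs(["jon", "snow", "knows", "nothing"], window=1)
# cbow[1] == (["jon", "knows"], "snow")
# sg[0]   == ("jon", "snow")
```

Note how skip-gram emits one training pair per context word while CBOW emits one per centre word, which is part of why skip-gram trains more slowly but squeezes more out of small corpora.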

  • @jenilsaliya3769
    @jenilsaliya3769 a year ago

    Sir, your content and explanation are mind-blowing, so thank you, sir.
    One request:
    sir, can you provide your OneNote notes file so that we can refer to it when needed?

  • @AIBoxed
    @AIBoxed 4 months ago

    loved it

  • @arjunsingh4434
    @arjunsingh4434 3 months ago +1

    super sir

  • @ATHARVA89
    @ATHARVA89 2 years ago +1

    Nitish, I wanted to know whether judiciously following the assignments will do the trick for us, or do we need to study the NLTK and spaCy libraries in detail? Just asking!

  • @muhammadsajad2230
    @muhammadsajad2230 a year ago

    Also, make a video on self supervised learning for computer vision applications.

  • @hamzasabir6480
    @hamzasabir6480 9 months ago

    In the CBOW section, I think the dimensions are wrong: there are 5 words in total and the weights coming from the prior (hidden) layer are 3, so the dimension of the last layer should be 5x3.
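The shape point above can be checked with a toy matrix sketch (assumed sizes: vocabulary V=5, embedding dimension N=3; whether the output weight matrix reads 3x5 or 5x3 depends only on which side the hidden vector multiplies):

```python
import numpy as np

# Toy CBOW forward pass with assumed sizes V=5 (vocab) and N=3 (embedding).
V, N = 5, 3
W_in = np.random.rand(V, N)   # input->hidden: one N-dim embedding per word
W_out = np.random.rand(N, V)  # hidden->output: maps N-dim hidden to V scores
context = np.zeros((2, V))    # two context words as one-hot rows
context[0, 1] = context[1, 3] = 1.0
hidden = context.mean(axis=0) @ W_in  # shape (N,) = (3,)
scores = hidden @ W_out               # shape (V,) = (5,), then softmax over vocab
```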

  • @alroygama6166
    @alroygama6166 2 years ago +1

    Sir, can you do one on BERT, please?

  • @sayanroy281
    @sayanroy281 2 years ago +2

    Sir, I have doubts.
    1. Why are there differences between upper-case and lower-case words? If we look at the vectors of the two words "bag" and "Bag", the two vectors are different. 2. Why is there only one hidden layer?
    And one more thing: Google has its own GCP, yet the file "GoogleNews-vectors-negative300.bin.gz" is stored in an AWS S3 bucket. Why?

    • @campusx-official
      @campusx-official  2 years ago

      1. Because each character has different binary values.
      2. The creators must have tried different architectures.
      3. It's not Google who has hosted it; it's the creator of gensim, I guess.
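A small addition to point 1: word2vec treats "Bag" and "bag" as two unrelated tokens simply because the strings differ, so each gets its own separately trained vector. The usual fix is to lowercase during preprocessing, sketched here in plain Python:

```python
# Lowercase before building the vocabulary so "Bag" and "bag" share one entry.
sentence = "The Bag and the bag"
tokens = [w.lower() for w in sentence.split()]
print(tokens)  # ['the', 'bag', 'and', 'the', 'bag']
```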

  • @technicalhouse9820
    @technicalhouse9820 2 months ago

    Love you sir

  • @farhadkhan3893
    @farhadkhan3893 a year ago

    Amazing

  • @sahibnoorsingh2432
    @sahibnoorsingh2432 a year ago

    I loved your videos; please make a video on transformers too

  • @viratmani7011
    @viratmani7011 2 years ago +2

    Sir, please share links for the basics of neural networks.

  • @749srobin
    @749srobin 2 years ago +1

    Now this GoogleNews link has only 94 entries; is there a different link?

  • @piyushpathak7311
    @piyushpathak7311 2 years ago +2

    Sir, please cover DBSCAN, the XGBoost algorithm, and anomaly detection in the ML playlist, please 🙏 Sir

  • @SurajitDas-gk1uv
    @SurajitDas-gk1uv 7 months ago

    Thanks sir :)

  • @user-mf6vv5lc5h
    @user-mf6vv5lc5h a month ago

    At 12:38 in the video, the given URL is not working now. Please provide some source to get the Google News 300 dataset.

  • @ameybikram5781
    @ameybikram5781 a year ago

    The same word can be present in different sentences, so do we calculate a word vector for each sentence and take the average?

  • @matindram
    @matindram a year ago +1

    That Amazon AWS Google News vector file is giving a 404 Not Found error in the Colab notebook.

  • @pranav283
    @pranav283 6 months ago

    Why did you take 10 nodes in neural network input?

  • @miteshmungse5251
    @miteshmungse5251 2 years ago +1

    Sir, please make a video on how to explain an ML project in an interview.

  • @byotikram4495
    @byotikram4495 a year ago

    Thank you so much, Sir, for this nicely explained tutorial. I have just one doubt, Sir: if a sentence has only 3 words, but while training the model on the GoT data we used window size 10, what context words will it consider other than its two context words? (Sir, can we assume it treats the whole corpus as one single sentence containing all the words in the vocabulary?)

  • @KumR
    @KumR 5 months ago

    Wow

  • @ansiyaabdulrazak
    @ansiyaabdulrazak 6 months ago

    That Amazon AWS Google News vector file is giving a 404 Not Found error in the Colab notebook. How can I solve this to see that file? Anybody, please help me.
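One possible workaround for the dead link (assuming the gensim package is installed): fetch the same pretrained model through gensim's downloader API, which hosts it under the name "word2vec-google-news-300" (roughly 1.6 GB, so the first call takes a while):

```python
# Sketch only: api.load downloads and caches the pretrained vectors,
# returning a KeyedVectors object like the one loaded in the video.
import gensim.downloader as api

wv = api.load("word2vec-google-news-300")
print(wv.most_similar("king", topn=3))
```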

  • @PIYUSH-lz1zq
    @PIYUSH-lz1zq 2 years ago +1

    sir, a health care website, please

  • @tanishkarawat5266
    @tanishkarawat5266 8 months ago +1

    The link to the Google data isn't working, right? At 12:39

  • @usmanyousaaf
    @usmanyousaaf a year ago

    ERROR: get_normed_vectors
    AttributeError: 'Word2VecKeyedVectors' object has no attribute 'get_normed_vectors'

  • @prashantranjan7146
    @prashantranjan7146 a month ago

    Where can I find the link to the Google News file on AWS?

  • @GauravPatinge
    @GauravPatinge a year ago

    sir, is a linear activation function used in the hidden layer in CBOW and skip-gram?

  • @anupriyasharma9214
    @anupriyasharma9214 a year ago

    'My machine started crying' 😂

  • @varunahlawat9013
    @varunahlawat9013 a year ago

    Sir, through which parameter can we decide between CBOW vs Skip-Gram?

  • @virajpatel3382
    @virajpatel3382 a year ago

    Hi, I was having issues getting the Google Colab notebook to work; I consistently got a "not found" error when trying to get the Google pre-existing model.

  • @munishmangla9998
    @munishmangla9998 a year ago

    Good evening, sir.
    Kindly provide the updated link to the dataset.
    It's a humble request,
    so that we can practice.