A friendly introduction to Bayes Theorem and Hidden Markov Models

  • Published 29 Aug 2024
  • Announcement: New Book by Luis Serrano! Grokking Machine Learning. bit.ly/grokkingML
    40% discount code: serranoyt
    A friendly introduction to Bayes Theorem and Hidden Markov Models, with simple examples. No background knowledge needed, except basic probability.
    Accompanying notebook:
    github.com/lui...

Comments • 692

  • @pauldacus4590
    @pauldacus4590 5 years ago +222

    Happy I found this video.. even though it was rainy outside

    • @kebman
      @kebman 4 years ago +5

      Happy I found this video.. even though there's a Corona lockdown :D

    • @pqppd8491
      @pqppd8491 4 years ago

      It's coincidentally rainy outside 😂

    • @a7md944
      @a7md944 3 years ago

      Based on previous experience, because it is rainy on your side, I predict that you were probably not happy 😔

    • @TymexComputing
      @TymexComputing 2 months ago

      @@a7md944 Bob was more likely not happy. We are the hidden state - what's the probability that the lockdown was not justified, and that people were dying from lack of medical help instead of from the illness?

  • @somdubey5436
    @somdubey5436 3 years ago +12

    You are one of that rarest of breeds: a gifted teacher.

  • @csejpnce2585
    @csejpnce2585 6 years ago +20

    Usually Bayes Theorem and HMMs are a nightmare even for researchers. In this video these nightmares are made child's play. I'm highly thankful for this service you are providing to the academic community - teachers, researchers and students. Keep it up, Luis Serrano, and I hope to see many more like this in the future!!!

  • @simpleprogramming4671
    @simpleprogramming4671 6 years ago +84

    Wow, perfect explanation. Even a kid can learn HMMs by watching this video.

  • @codebinding71
    @codebinding71 6 years ago +207

    Your video tutorials are a great breakdown of very complex information into very understandable material. Thank you. It would be great if you could make a detailed video on PCA, SVD, eigenvectors, Random Forests, CV.

    • @jacobmoore8734
      @jacobmoore8734 5 years ago +7

      Eigenvectors and SVD for sure.

    • @ericbauer6595
      @ericbauer6595 4 years ago +11

      @@jacobmoore8734 check out 3blue1brown's channel for the Essence of Linear Algebra. He explains that matrices are linear functions like y=f(x), or like a line y=mx with y-intercept b=0. Eigenvectors are special inputs x such that f(x) = kx, where k is some scalar coefficient (k is the eigenvalue associated with the special input x).
      Certain types of NxN matrices (the covariance matrix used in PCA, for example) are super interesting because any point in N-dimensional coordinates can be represented as a linear combination (a*x1 + b*x2 + ...) of the eigenvectors. The eigenvectors form a 'basis' for that space. This is where SVD (singular value decomposition) comes in. SVD essentially asks, "instead of just multiplying x by your matrix, why don't you decompose this task into 3 easier tasks?" Let's say your matrix is C for covariance. Then SVD says that C = ULU', where U has the eigenvectors as its columns, U' is the transpose of U, and L is a diagonal matrix of the eigenvalues.
      Pretend we're doing y = C*x. First we do w = U'*x. This represents x as a linear combination of eigenvectors. Said another way, you've changed the representation of point x from the original coordinate system to the eigenvector coordinate system. Next we do z = L*w, which scales every value of vector w by an eigenvalue. Some of these eigenvalues are very small and shrink the corresponding entry toward 0; some are relatively large and scale it up. Finally, when you do y = U*z, all you're doing is translating your scaled z vector back into the original coordinate system.
      So SVD basically splits a matrix into 3 different operations:
      1. represent an input vector in terms of eigenvector coordinates
      2. scale each coordinate by an eigenvalue
      3. represent the scaled result back in terms of the original coordinates
      When you look at PCA (principal component analysis), you take your covariance matrix and decompose it to see how much the eigenvalues scale the eigenvector coordinates. The largest eigenvalues correspond to the directions (eigenvectors) of largest variation.
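
      A minimal numpy sketch of this three-step view (illustrative only: the toy data and variable names are mine, not from the video or its notebook; np.linalg.eigh applies because a covariance matrix is symmetric, and for symmetric matrices the SVD and the eigendecomposition coincide up to ordering and signs):

          import numpy as np

          rng = np.random.default_rng(0)
          X = rng.normal(size=(200, 3))              # toy data: 200 samples, 3 features
          C = np.cov(X, rowvar=False)                # covariance matrix

          # Eigendecomposition of the symmetric matrix C: C = U @ diag(L) @ U.T
          L, U = np.linalg.eigh(C)                   # columns of U are eigenvectors

          # y = C @ x, done as three easier steps
          x = rng.normal(size=3)
          w = U.T @ x        # 1. rewrite x in eigenvector coordinates
          z = L * w          # 2. scale each coordinate by its eigenvalue
          y = U @ z          # 3. map the result back to the original coordinates
          assert np.allclose(y, C @ x)

          # PCA: the largest eigenvalues mark the directions of largest variance
          print(U[:, np.argsort(L)[::-1]])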

    • @noduslabs
      @noduslabs 4 years ago

      Definitely eigenvectors! Please!

    • @kapil_vishwakarma
      @kapil_vishwakarma 4 years ago

      Yes, please, do that.

    • @SaptarsiGoswami
      @SaptarsiGoswami 4 years ago

      You may have already found some, but this is an attempt by the University of Calcutta - not so coolly done, but please see if it makes sense: czcams.com/video/C6fH5Nfoj40/video.html

  • @Slush_
    @Slush_ 4 years ago +14

    You have just saved me, this was such a clear breakdown of Bayes Theorem and HMMs, and exactly what I needed at the 11th hour of a project I'm working on!

  • @shuchitasachdev9310
    @shuchitasachdev9310 5 years ago +4

    This is the best description of this topic I have ever seen. Crystal clear! True knowledge is when you can explain a complex topic as simply as this!

  • @AbeikuGh
    @AbeikuGh 3 years ago +2

    I was quite tense when my supervisor pointed out that my master's thesis should incorporate HMMs. This video is my first introduction to HMMs. You chased my fears away with your simple explanation and tone. Forever grateful.

    • @carlosmspk
      @carlosmspk 3 years ago +1

      Similar situation here: I have a master's thesis in anomaly detection, and using HMMs is a candidate. I'm afraid it's much more complicated than this, but it sure made it look less scary.

  • @danielking7988
    @danielking7988 6 years ago +4

    Your videos are amazing! As someone who hasn't looked at calculus in 20 years, I find these "friendly introduction" videos extremely helpful in understanding high-level machine learning concepts, thank you! These videos really make me feel like this is something I can learn.

    • @generationgap416
      @generationgap416 a year ago

      Isn't this the opposite of calculus? Discrete vs. continuous functions.

  • @pratiksharma1655
    @pratiksharma1655 5 years ago +2

    I wasted the whole day trying to understand HMMs by watching useless youtube videos, until I saw this. Thank you so much for this video. It is so simple and so intuitive. So very thankful to you :)

  • @chenqu773
    @chenqu773 3 years ago

    The most exciting thing about your videos is that most of them are a one-stop solution for dummies like me, without the need to go to 100 other places to find 50 missing pieces of info. Many thanks!

  • @me-zb7qm
    @me-zb7qm 6 years ago +1

    I have a midterm in 8 hours and this video is the only thing that's really helped me so far. Cleared up all my confusions during 8 lectures in 32 minutes. Thank you so much, from the bottom of my heart.

    • @SerranoAcademy
      @SerranoAcademy 6 years ago

      Thank you for your note, I hope the midterm went great! :)

  • @LizaBrings
    @LizaBrings 5 years ago +7

    Omg. You just replaced an entire dry, incomprehensible bioinformatics book! I can’t thank you enough! It’s so easy!

  • @BabakKeyvani0
    @BabakKeyvani0 5 years ago +7

    Thank you so much for this great video, Luis. I am a Udacity alumnus myself. I have watched and read many videos and articles on Bayes and HMMs, but your video is by far the best. It explains all the steps in the right amount of detail and does not skip any steps or switch examples. The video really helped solidify the concepts, and giving the applications of these methods at the end really helps put them in context. Thank you again for your informative and helpful video.

  • @changyulin47
    @changyulin47 6 years ago +14

    OMG! You are amazing! I consider myself an information theory guy and should know this pretty well. But I could never present this idea as simply and understandably as you did! Great, great job! I will for sure check out your other videos! Thank you!

  • @PinaTravels
    @PinaTravels 3 years ago +4

    This has taken me from 0 to 80% on HMM. Thanks for sharing

  • @johnpetermwangimukuha
    @johnpetermwangimukuha a year ago +1

    Man, Bayesian theory has been having me for breakfast! Thank you for this tutorial!

  • @noduslabs
    @noduslabs 4 years ago +9

    Beautiful work! It’s the most accessible introduction to Bayesian inference I’ve seen. Great job! Please, keep them coming!

  • @bhanukadissanayake9988
    @bhanukadissanayake9988 3 years ago +3

    A great explanation! He used 16 items at 6:28 to calculate the transition probabilities, but 15 items for the emission probabilities at 8:09. Did anyone notice that? :) (See the counting sketch after this thread.)

    • @wp1300
      @wp1300 a year ago

      Yes, I noticed that, so the results in this demonstration are wrong.
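
      A quick sketch of where that off-by-one comes from: an n-day sequence has n emissions (one mood per day) but only n-1 transitions (one per consecutive pair of days), so the two counts differ unless the sequence is wrapped around. The sequences below are made up for illustration, not the ones from the video:

          from collections import Counter

          weather = list("SSRRSSSRSS")   # hidden states, one per day (illustrative)
          mood    = list("HHGGHGHGHH")   # observed moods, one per day (illustrative)

          transitions = Counter(zip(weather, weather[1:]))   # n-1 consecutive pairs
          emissions   = Counter(zip(weather, mood))          # n state/observation pairs

          src = Counter(weather[:-1])    # how often each state starts a transition
          days = Counter(weather)        # how often each state occurs at all
          trans_prob = {p: c / src[p[0]] for p, c in transitions.items()}
          emit_prob  = {p: c / days[p[0]] for p, c in emissions.items()}

          print(trans_prob)   # e.g. P(tomorrow=R | today=S)
          print(emit_prob)    # e.g. P(mood=H | weather=S)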

  • @theavo
    @theavo 4 years ago +2

    I'm on a streak of watching your third video in a row and instantly liking it for its outstandingly easy-to-understand breakdown of a quite complex topic. Well done, sir, I'll visit your channel again for sure! ✨

  • @kassymakhmetbek5848
    @kassymakhmetbek5848 4 years ago

    I wish professors would just show this video in lectures... You are great at making these animations, and your speech is perfect. Thank you!

  • @sametcetin9235
    @sametcetin9235 5 years ago +32

    Hi Luis, thank you for your friendly introduction. When I was working on an assignment and trying to implement the Viterbi method following your explanation, I noticed that there may be some mistakes in your calculations. You calculated the best path starting from the beginning (the leftmost side) and selected the weather condition (sunny or rainy) with the max value at each step. However, I am not sure that this is the correct way to apply Viterbi. You don't mention anything about backpointers.
    I reviewed the HMM chapter of Speech and Language Processing by Dan Jurafsky. There, it is stated that to find the best path we should start from the end (the rightmost side). First we select the weather condition with the max probability (that is actually the last node of our path; we find the full path in reverse order). Then we do a backward pass and select the weather condition which maximizes the probability of the condition we just selected, instead of just looking for the max probability among all conditions at that observation time. We continue this process until we reach the beginning.
    Two things to emphasize (see the sketch after this thread):
    1. We go backward (from end to start).
    2. We don't just select the weather condition with the maximum probability at each observation time; instead we select the max one only once, at the end, and then select the conditions that maximize the one that comes after, like a chain.
    If I am wrong, please enlighten me.
    Best.

    • @priyankad7674
      @priyankad7674 4 years ago +3

      You are right

    • @arikfriedman4442
      @arikfriedman4442 3 years ago

      I agree. It seems there is a minor mistake there: choosing the "best" probability on each day doesn't ensure the optimal path we are looking for. If I understand correctly, you should start from the end, looking for the best final probability, then go "backwards", looking for the specific path which led to that final probability.

  • @viro-jx2ft
    @viro-jx2ft 3 months ago

    This is the best video on HMM you will ever find. Complicated concepts handled soooo wellll🥰

  • @mrinmoykshattry527
    @mrinmoykshattry527 3 years ago

    This is the best video explaining HMMs so simply to someone who doesn't have a computer science background. Godspeed to you.

  • @muhammadyousuf2828
    @muhammadyousuf2828 4 years ago

    I am a bio-organic chemist, and our bioinformatics course included the Hidden Markov Model. Your video helped me learn the idea without immersing myself deep in the mathematics. Thanks...

  • @freeustand
    @freeustand 5 years ago +6

    At around 29:00, you say that "all we have to do is to pick the largest one each day and walk along that path," picking the "S-S-S-R-R-S" sequence as the best one. But my calculation shows that the "S-S-R-R-R-S" sequence actually gives a higher likelihood for the observed sequence of Bob's moods. I think what we have to do is not "just pick the largest one each day and walk along that path," but "pick the sequence of weathers that eventually led to the largest one on the last day" (see the numeric check after this thread). Please correct me if I'm wrong. Anyway, this video is super helpful! Thanks a lot!

    • @mehransh7753
      @mehransh7753 5 years ago +1

      I agree with you, Junghoon; I reached the same conclusion. I think the best way is to actually record, for each maximum value, the path by which we reached it; at the end, we start from the maximum and print the recorded path. Or, instead of using memory, as you said, we can work back from the maximum value on the last day to see which predecessor matches along the path, all the way to the start day.
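
      The claim is easy to verify by brute force: with only 2^6 = 64 possible weather sequences, we can score the joint probability of each against the observed moods and rank them. A small sketch, under the same assumed numbers as in the Viterbi sketch above:

          from itertools import product

          prior = {"S": 2/3, "R": 1/3}
          trans = {("S", "S"): 0.8, ("S", "R"): 0.2, ("R", "S"): 0.4, ("R", "R"): 0.6}
          emit  = {("S", "H"): 0.8, ("S", "G"): 0.2, ("R", "H"): 0.4, ("R", "G"): 0.6}
          obs = "HHGGGH"                     # Bob's observed moods

          def joint(path):
              """P(weather sequence) * P(observed moods | weather sequence)."""
              p = prior[path[0]] * emit[(path[0], obs[0])]
              for prev, cur, o in zip(path, path[1:], obs[1:]):
                  p *= trans[(prev, cur)] * emit[(cur, o)]
              return p

          best = max(product("SR", repeat=len(obs)), key=joint)
          print("-".join(best), joint(best))   # S-S-R-R-R-S, ~0.0017
          print(joint(tuple("SSSRRS")))        # the per-day-maximum pick, ~0.00076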

  • @i.d432
    @i.d432 4 years ago +1

    What a clear way of teaching. You're a total rockstar of teaching stats. OK, let's do the Baum-Welch algo.

  • @fanfanish
    @fanfanish 5 years ago +1

    I can't believe how you made it so clear and simple. Gorgeous.

  • @aatmjeetsingh7555
    @aatmjeetsingh7555 4 years ago

    This example made everything crystal clear. I have an exam tomorrow on HMMs; initially I was anxious, but after this video I'm sure I can solve any problem.
    Thank you very much, sir.

  • @deveshmore3106
    @deveshmore3106 3 years ago

    As feedback I would say your explanation is spot on... a person with basic statistical knowledge can understand HMMs with your explanation.

  • @johnny_eth
    @johnny_eth 5 years ago +1

    Very nice and concise explanation. The only thing lacking is that you did not derive the Bayes theorem formula from the example, which is something any student will see over and over again.

  • @urcisproduction797
    @urcisproduction797 6 years ago +1

    You are the best explainer I have found on YouTube so far! Great work!

  • @cybren2
    @cybren2 4 years ago

    Thank you so much for this video! I searched for hours, watched many videos, read many websites/papers etc., but I never really understood what an HMM and the associated algorithms are and how they work. You explained everything, from how it works to how to implement it, so well that I got in 30 minutes what I didn't get in hours before. Thank you so much!!

  • @AB-km5sp
    @AB-km5sp 5 years ago +3

    The best explanation of HMMs ever! Very visual and easy to grasp. Enjoyed learning so much. Thanks!
    Edit: Can you please do a friendly video on the EM algorithm, too?

  • @at4652
    @at4652 6 years ago +1

    Top notch, the best explanations. You are taking complex subjects and making them intuitive - not an easy thing to do!

  • @StarEclipse506
    @StarEclipse506 5 years ago

    I took a probability class and did badly. After recently finding out I'd need to revisit it for machine learning, I was a bit concerned. Then I came to understand an algorithm built on Bayes' Theorem!! How incredible, thank you!!

  • @ipaliipali8804
    @ipaliipali8804 6 years ago

    Having been a teacher myself for a long time, all I can say is that this video is awesome! You have a talent, my friend.

  • @mahdiebrahimi1241
    @mahdiebrahimi1241 4 years ago

    Best description of HMMs. I had a hard time understanding this topic, but your teaching kept me motivated to keep learning.

  • @AnshumanDVD
    @AnshumanDVD 4 years ago

    I am a first-time viewer, but with such amazing explanations I will always stick to your teaching. Wow, so nicely explained!

  • @user-de8ue5cs6s
    @user-de8ue5cs6s 4 years ago +1

    My dad recommended I watch this, and I sure am thankful he did :D Great video!

  • @vladimir_egay
    @vladimir_egay 4 years ago

    Nice job! Best explanation so far. You explained 6 weeks of my class in 32 minutes!

  • @geovalexis
    @geovalexis 4 years ago +1

    Simply amazing! After quite a long time struggling to understand HMMs, I finally get it. Thank you so much!!

  • @sintwelve
    @sintwelve 5 years ago +1

    Thank you so much for this. I wish more educators were like you.

  • @knasiotis1
    @knasiotis1 2 years ago

    You did a better job teaching this than my MSc.

  • @sonasondarasaesaria1941

    Hi Luis Serrano, thanks for the clear explanations! Your informal way of explaining this material is the best for us students; even my professor in our Machine Learning class recommended this video for learning the HMM introduction!

  • @RC-bm9mf
    @RC-bm9mf 3 years ago

    Dr. Serrano, I think you are an embodiment of Feynman in ML education! Thanks a lot!!

  • @BrandonRohrer
    @BrandonRohrer 6 years ago +39

    Great example, cleanly executed. Up to your usual high standards.

    • @SerranoAcademy
      @SerranoAcademy 6 years ago

      Thank you so much Brandon! Always happy to hear from you.

    • @pratiksingh9480
      @pratiksingh9480 5 years ago

      Luis, your way of teaching is so good, even a 10-year-old would be able to understand such a complex topic.
      Will definitely check out your book as well once my current stack is finished.

  • @eTaupe
    @eTaupe 4 years ago

    Thanks to your videos, I save a huge amount of time. Focusing on the intuition and mechanics allows an instant understanding BEFORE delving into the math.

  • @SimoneIovane
    @SimoneIovane 4 years ago

    I think it is the clearest explanation of HMMs: a university course in a 30-minute video.

  • @girishtiwari79
    @girishtiwari79 5 years ago +1

    Great tutorial. While calculating the transition probabilities, you took 3 sunny days at the end (4 sunny, 3 rainy, 4 sunny, 2 rainy, and a final 3 sunny), but to calculate the probabilities of sunny and rainy without knowing Bob's mood, you took 2 sunny at the end. I think you used the last 3rd sunny day to loop back to the first sunny day, since we cannot start with sunny on our own. A cyclic representation would be better, to clear up the doubts this may raise.

  • @williamhuang5455
    @williamhuang5455 3 years ago

    As a high schooler, this video was very helpful, and I understand HMMs a lot more now!

  • @milleniumsalman1984
    @milleniumsalman1984 4 years ago

    Cheated on my work hours to watch this course; it was totally worth it.

  • @castro_hassler
    @castro_hassler 5 years ago +17

    This guy is amazing. Hey bro, could you make a video comparing classical techniques like this one with RNNs - which one generalizes better, and when to use one over the other? Thanks and keep it up!

  • @arbaazaattar6266
    @arbaazaattar6266 6 years ago +2

    Made my day... I learned the Hidden Markov Model for the first time ever, and guess what? It was damn simple to understand the way you explained it.

  • @shapeletter
    @shapeletter 4 years ago

    It was so nice with images! When you switched to letters, it was super clear how much easier it was to look at images!

  • @rephechaun
    @rephechaun a year ago

    Thank you very much! I really like the way that you initially explain everything with emojis; that's very relatable and easy to follow along with in my head. Others explain with coins, dice, and worst of all, Greek letters that make no real-life sense at all. Thank you, thank you very much! You really saved me tons of time and headache.

  • @namnguyen7153
    @namnguyen7153 4 years ago +1

    Thank you so much! This video literally helped me understand 3 lectures of my machine learning class.

  • @ramakalagara3577
    @ramakalagara3577 3 years ago

    You made it so easy for learners... Appreciate the time you are spending creating the content!!

  • @anirudha_ani
    @anirudha_ani 2 years ago

    This is hands down the best video on HMM.

  • @miroslavdyer-wd1ei
    @miroslavdyer-wd1ei 8 months ago

    I explain Bayes with a horizontal event tree, like a decision tree. Very good job, Mr. Serrano.

  • @iglf
    @iglf 4 years ago

    I was going through HMMs for robot localization and found this super clear explanation. You're a phenomenon, Luis. Thanks!

  • @CBausGB
    @CBausGB 3 years ago

    It's impressive how simply you explain very complex issues! Thank you!!

  • @jessiestone99
    @jessiestone99 5 years ago +1

    I understand Hidden Markov Models for the first time!! Please teach us about the forward-backward algorithm for HMMs. Thank you SO much!

    • @edthoreum7625
      @edthoreum7625 4 years ago

      "Hidden"?
      Depends on where Bob lives?
      In Las Vegas = super happy, 0% rain
      In Oregon = unhappy, 90% rain

  • @vardaanbhave2231
    @vardaanbhave2231 3 months ago

    Dude, thanks a ton for explaining this so simply

  • @dardodel
    @dardodel 3 years ago

    Out of 319k views (as of now), 1k are from me! Every time I watch this video, I learn something new. Best video, great instructor! Luis' videos always remind me of the Albert Einstein quote: "If you can't explain it simply, you don't understand it well enough."

  • @arisweedler4703
    @arisweedler4703 3 years ago

    Thanks for the straightforward explanation of Bayesian networks + Hidden Markov Models. Cool stuff! Very powerful.

  • @boamixtura1155
    @boamixtura1155 5 years ago

    I love you, man. All math should be explained like this: easy and intuitive. I'm tired of sayin' it.

  • @georgikyshenko4380
    @georgikyshenko4380 4 months ago

    The best explanation on the internet. Thank you!

  • @anderswigren8277
    @anderswigren8277 6 years ago +1

    This is the best explanation of HMMs I have ever seen up to now!

  • @ImperialArmour
    @ImperialArmour 3 years ago

    Thanks Luis, I was taught HMMs using speech recognition, but will be having a case-study test on robot vacuums using this. I really appreciate it.

  • @blz1rorudon
    @blz1rorudon 4 years ago +1

    I can do nothing except give my utmost respect to you, sir. Thank you so much for a fantastically eloquent explanation.

  • @SupremeSkeptic
    @SupremeSkeptic 6 years ago

    Very comprehensive and easily understandable. Even though I got increasingly impatient watching the whole thing, I still managed to swing the thumbs-up.

  • @Fdan36
    @Fdan36 3 years ago

    Really liked the video. I was looking to understand HMMs for neuron spiking, and things are much clearer now.

  • @PALPABLEemotions
    @PALPABLEemotions 4 years ago

    Excellent video! I remember looking at this on Wikipedia and just not having a clue what it meant; you did a fantastic job of explaining it!

  • @stephenhobbs948
    @stephenhobbs948 4 years ago

    Very easy to understand using Bob and Alice and the weather. Thanks.

  • @soliloquy2006
    @soliloquy2006 4 years ago +1

    Thanks so much for this! It really helped with a research report I'm writing. Clear and easy to understand and the pacing was excellent for being able to take notes.

  • @amyrs1213
    @amyrs1213 3 years ago

    Your videos are very helpful and give good intuition for complex topics :) Many thanks from Siberia.

  • @KhaledBarie
    @KhaledBarie 5 years ago +8

    I think the path at 29:05 should be
    Sunny-Sunny-Rainy-Rainy-Rainy-Sunny,
    because the 0.0147 rainy probability on the 4th day comes from the previous day being rainy.
    I noticed this because an implementation of the algorithm that I found gave me those results, so someone correct me if I'm wrong.
    Nevertheless, a great video!

    • @TheCosmonaut1
      @TheCosmonaut1 4 years ago

      Yes, I also think he is wrong. He didn't apply backtracking to the previous state which maximized the current state!

    • @darshitthakar7999
      @darshitthakar7999 4 years ago

      Can you explain your sequence? According to the tutor in the video, the highest-probability weather each day is given preference. By that logic I find his sequence to be correct.

    • @noagarnett
      @noagarnett 4 years ago

      Indeed so. The backtracking part of the algorithm is missing. Really strange. I will move to a different tutorial, I'm afraid.

    • @KhaledBarie
      @KhaledBarie 4 years ago +1

      @@darshitthakar7999 The backtracking part in the video is off. In the forward pass you calculate the probabilities of each day being sunny/rainy in two scenarios: given that the previous day was rainy and given that the previous day was sunny. You then select the maximum of the two, but you also store which previous state gave that higher probability. When backtracking, you start from the end, select the maximum probability of the two final states, and then follow back which assumption about the previous day gave you that probability - not just choose the maximum on each day.

  • @richardchaven
    @richardchaven 4 years ago

    This finally made Bayes' method intuitive. Thank you

  • @nigerinja7195
    @nigerinja7195 3 years ago

    Thanks a lot! I came across your video while searching for an HMM explanation for my computational biology course, and it helped a lot in understanding the basic principle :)

  • @johnykariuki5005
    @johnykariuki5005 4 years ago +1

    Surely the best video on HMM.

  • @ashishgohil9717
    @ashishgohil9717 4 years ago

    Very nicely explained. It takes a lot to teach a complex topic like HMMs in such a simple way. Very well done. Thank you.

  • @fuadmohammedabubakar9202

    Really amazing video that breaks down Bayes Theorem for simple understanding. Thanks Luis

  • @kimdinh8359
    @kimdinh8359 4 years ago

    This video was really useful for learning HMMs, as well as the probability calculations within the algorithms. The example is easy to understand. Thank you so much.

  • @OzieCargile
    @OzieCargile 3 years ago

    Best video of its kind on YouTube.

  • @windandwolf
    @windandwolf 4 years ago

    Love it! Please add "Viterbi algorithm" to the title. Your explanation of that is super easy to understand and follow. Thank you, thank you, thank you!

  • @hellomacha4388
    @hellomacha4388 3 years ago

    Very, very nice and impressive explanation; even a layman can understand this concept. Thank you, sir, for putting so much effort into making this video.

  • @balasahebgadekar425
    @balasahebgadekar425 3 years ago

    Excellent, excellent.
    Great job.
    All your videos are enlightening for academicians.

  • @dYanamatic
    @dYanamatic 4 years ago

    Amazing... I just bought your book from Australia. Thank you for your time and effort!!!

  • @avwillis
    @avwillis 5 years ago

    A beautiful combination of all the difficult concepts in probability in one video. Great job.

  • @vaibhavjadhav3453
    @vaibhavjadhav3453 3 years ago

    Thank you so much for this beautiful explanation. I learned about applying Bayes and Markov together... Would be happy to see more engineering applications of these theorems.

  • @jfister2481
    @jfister2481 6 years ago

    Best explanation of Hidden Markov Models on the Internet. Well done.

  • @spikersdoom4300
    @spikersdoom4300 4 years ago

    This video is too good to be free.
    If you can't understand it from this video, then you probably can't understand it from anywhere else. :)

  • @alaraerenel9082
    @alaraerenel9082 4 years ago

    It was such a clear explanation of the concept, and it really helped me understand. I have only one recommendation: you could explain the applications more deeply. That would make the concept even more understandable, especially for those who want to use it.

  • @mitchellphiri5054
    @mitchellphiri5054 5 years ago

    I always just saw posts about HMMs, and I decided to give your video a try. The explanations are just so fluid; I'm interested now.

  • @martadomingues1691
    @martadomingues1691 2 months ago

    Very good video; it helped clear some doubts I was having with this and with the Viterbi algorithm. It's just too bad the notation used was so different from my class, but it did help me understand everything and connect all of it. Thank you!

  • @meshackamimo1945
    @meshackamimo1945 5 years ago +3

    And please do a video on the Baum-Welch algorithm.
    Once again, no words to thank you!
    Happy New Year!

  • @gravitycuda
    @gravitycuda 5 years ago +1

    Wow, that has changed a lot in my understanding of Bayesian inference. Your voice sounds like the guy from Passenger.

  • @shreyaspradhan8546
    @shreyaspradhan8546 3 years ago

    Holy shit, this was so intuitive. I'm so glad I found this. The best resource to learn about HMMs.

  • @dennishuang3498
    @dennishuang3498 5 years ago

    So great to use a simple example to explain a confusing yet very important topic! Appreciate your excellent tutorial!