Hidden Markov Model Clearly Explained! Part - 5

  • Published 28. 08. 2024
  • So far we have discussed Markov Chains. Let's move one step further. Here, I'll explain the Hidden Markov Model with an easy example. I'll also show you the underlying mathematics.
    #markovchain #datascience #statistics
    For more videos please subscribe -
    bit.ly/normaliz...
    Markov Chain series -
    • Markov Chains Clearly ...
    Facebook -
    / nerdywits
    Instagram -
    / normalizednerd
    Twitter -
    / normalized_nerd

Comments • 261

  • @cmfrtblynmb02
    @cmfrtblynmb02 1 year ago +199

    I got my PhD back in the 2000s. I wish material like this had existed back then. I struggled so much with a lot of basic concepts. I had to look at material written in the most extreme mathematical notation without gaining any intuition. You have done an amazing job explaining something I had a really hard time grasping. Well done.

    • @amrdel2730
      @amrdel2730 1 year ago +1

      Yeah, PhD-level work is almost always using or linked to some machine learning theory understanding. I'm doing it now.

    • @kurghen988
      @kurghen988 9 months ago +1

      I thought exactly the same!

    • @HeartDiamond_variedad
      @HeartDiamond_variedad 8 months ago +1

      In Spanish there is no content like this; we have to resort to videos in English.

    • @stryker2322
      @stryker2322 3 months ago

      Because of people like you, we all are able to go deep into artificial intelligence and simplify what we are watching right now.

    • @RkR2001
      @RkR2001 2 months ago

      Great teaching, Nerd. Where are you based now? I am a doctor from India and need your academic help.

  • @pradipkumardas2414
    @pradipkumardas2414 2 years ago +29

    I am 60 years old. I am learning how to simplify a complex subject. You are a great teacher! God bless you.

  • @gracemerino3267
    @gracemerino3267 3 years ago +53

    Thank you for making these videos! I am starting my master's thesis on Markov Chains and these videos help me get a solid introduction to the material. Much easier to understand the books now!

    • @abdullahalsulieman2096
      @abdullahalsulieman2096 2 years ago

      That is great. How is it so far? I want to apply it to predict system reliability. Not sure how complicated it is. Could you advise please?

  • @k.i.a7240
    @k.i.a7240 24 days ago

    Better explained than over 3 AI university classes I have gone through. Simple, efficient. Thank you

  • @adaelasm6467
    @adaelasm6467 2 years ago +2

    This might be the fastest I've gone from never having used a concept to totally grokking it. Very well explained.

  • @mienisko
    @mienisko 10 months ago +1

    I'm just going through the Markov chain playlist, and because of its quality I'm going to subscribe to this channel. Great material!

  • @HKwak
    @HKwak 3 years ago +11

    Thanks for the videos. I am majoring in data science, and videos like this sometimes help enormously compared to reading texts. Very intuitive and visual. I don't think I will forget the weather signs you showed us today.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      That's the goal 😉

    • @Portfolio-qb7et
      @Portfolio-qb7et 11 months ago

      @@NormalizedNerd Can you briefly explain the steps involved in finding the probability of a sunny day? I really don't understand.

  • @IndianDefenseAnalysis
    @IndianDefenseAnalysis 1 year ago +2

    What a channel. I have never come across any data science channel like yours. You are doing fantastic work. Love your videos and going through them ❤

  • @mertenesyurtseven4823
    @mertenesyurtseven4823 1 year ago +6

    Dude, your videos are severely underappreciated. The animations, and the basic but complete style you use when talking about abstract and complex topics. I just discovered this channel, and I will recommend it to everyone, including beginners, because even they will be able to understand, as well as advanced people.

  • @mayurdeo627
    @mayurdeo627 3 years ago +10

    Amazing job. This really helps if you are preparing for interviews, and want a quick revision. Thank you for doing this.

  • @aramisfarias5316
    @aramisfarias5316 2 years ago +5

    After going through your Markov chains series, you, my friend, have got yourself a new subscriber! Great work. Your channel deserves to grow!

  • @dheerajmakhija9752
    @dheerajmakhija9752 8 months ago +1

    A very rare topic explained in the most precise manner. Very helpful 👍

  • @songweimai6411
    @songweimai6411 1 year ago +2

    Thank you, I really appreciate your work. Watching this video made me realize that my professor in school is not a good teacher.

  • @Mosil0
    @Mosil0 2 years ago +20

    Hey, thanks a lot for making these!
    One suggestion if you don't mind: you could avoid using red and green (especially those particular shades you used) as contrasting colors, given that they're close to indistinguishable to about 8% of males. Basically any other combination is easier to tell apart, e.g. either of those colors with blue.
    Just a minor quibble, the videos are otherwise very good!

    • @NormalizedNerd
      @NormalizedNerd  2 years ago +10

      Thanks a lot for pointing this out. Definitely will keep this in mind.

  • @josippardon8933
    @josippardon8933 1 year ago +1

    Great! Just great! I really don't understand why most professors at colleges hate this type of explanation. They always choose the standard "let's be super formal and use super formal mathematical notation" approach. Yes, it is important to learn the formal mathematics, but why not combine the formal and informal approaches and put them together in a textbook?

  • @sreelatharajesh2365
    @sreelatharajesh2365 11 months ago +1

    Wonderful video. Amazing explanation. Please explain why P(Y) is neglected. Or is it considered to be 1?

  • @ivanportnoy4941
    @ivanportnoy4941 2 years ago +9

    Loved it! I am looking forward to (maybe) seeing a video on the Markov Chain Monte Carlo (MCMC) algorithms. Best regards!

  • @muneshchauhan
    @muneshchauhan 2 years ago +7

    Many thanks for the beautiful visualization and summarization of the Markov model. Understanding it was effortless. It may require a little revision, but it is comprehensible with ease. 🙂

  • @Hangglide
    @Hangglide 3 months ago

    Hello author from the past! Your video is really helpful! Thank you!

  • @hyeonjusong1159
    @hyeonjusong1159 1 year ago

    You have a knack for teaching! The explanation was clear and the example along with the emojis was so cute! Thank you!!!

  • @zoegorman8233
    @zoegorman8233 3 years ago +6

    Super entertaining videos helping me with my Oxford master's thesis. Study night or movie night? Plus he has an awesome accent :-)

  • @brainnuke5450
    @brainnuke5450 3 years ago +2

    Don't stop what you are doing! It's amazing.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      Thanks!! :D

    • @asaha9479
      @asaha9479 3 years ago

      From the accent you're a Bengali, but are you from ISI? Great video, keep going.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      @@asaha9479 Yup, I'm a Bengali... but I don't study at ISI.

  • @gregs7809
    @gregs7809 3 years ago +6

    A really well laid out video. Looking forward to watching more

  • @philipbutler
    @philipbutler 3 years ago +3

    This was the perfect video for where I'm currently at. I learned about Markov chains last year, and just finally got a good grasp on Bayes' theorem (I struggled through Prob & Stats years ago). Thanks so much! Keep it up!

  • @debasiskar4662
    @debasiskar4662 2 years ago +2

    You explain things nicely. I would request you to make videos on advanced stochastic processes like semi-Markov processes, martingales, etc.

  • @sye9522
    @sye9522 4 months ago

    Amazing!! It really helps me understand the logic behind those scary HMM python codes. Thank you.

  • @noame
    @noame 1 year ago +4

    Thank you for the clarity of the explanation. Why did you neglect the denominator P(Y)? How can we calculate it? I assume that the correct argmax should take the denominator P(Y) into consideration.

    • @Ujjayanroy
      @Ujjayanroy 7 months ago

      We are taking the argmax of a function with the Xs as the variables, so Y doesn't matter because of the argmax... you can refer to Bayes' theorem for maximum likelihood; they always do the same thing.

  • @girishthatte
    @girishthatte 3 years ago +4

    Awesome explanation! Loved the way you explained the math used for the calculations!

  • @tatvamkrishnam6691
    @tatvamkrishnam6691 2 years ago

    Excellent! Skipped the 4th part of this magnificent Markov series. Took roughly 3 hrs to verify things at moments and convince myself. HIT MOVIE!!

  • @ViralPanchal97
    @ViralPanchal97 4 months ago

    Thanks a ton, I wish my professors from Monash Uni taught this way.

  • @adrenochromeaddict4232

    You are one of the rare Indian YouTubers who doesn't have an awful, incomprehensible accent, and you're also good at teaching. Congrats.

  • @Louisssy
    @Louisssy 3 years ago +5

    Really cool explanation! Can you also explain why P(Y) is ignored?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +10

      Because of two reasons...
      1. It's often hard to compute P(Y)
      2. To maximize that expression we only need to maximize the numerator (depends on X). Note that P(Y) doesn't depend on X.

    • @ts-ny6mx
      @ts-ny6mx 3 years ago

      @@NormalizedNerd Thank you! I had the same concern and your explanation makes sense!

    • @mariamichela
      @mariamichela 2 years ago

      @@NormalizedNerd 1. can't we just compute P(Y) as P(X1,Y) x P(X1) + P(X2,Y) x P(X2) + P(X3,Y) x P(X3) ?
      2. true, I agree. Since you didn't say it in the video I was just wondering where did P(Y) disappear, and didn't bother to think that the max was actually over X
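
(Editor's note: the question about the missing P(Y) recurs throughout this comment section, so here is the one-line derivation, written to match the Bayes' rule step in the video.)

```latex
\hat{X} = \arg\max_{X_1,\dots,X_n} P(X \mid Y)
        = \arg\max_{X_1,\dots,X_n} \frac{P(Y \mid X)\,P(X)}{P(Y)}
        = \arg\max_{X_1,\dots,X_n} P(Y \mid X)\,P(X)
```

The last step holds because P(Y) is a positive constant with respect to the X's: dividing every candidate sequence's score by the same number does not change which sequence attains the maximum.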

  • @artemiosantiagopadillarobl4280

    Great videos, keep it up! :)
    It would be nice to have a video about MCMC (Markov chain Monte Carlo) and the Metropolis-Hastings algorithm.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      Great suggestion.

    • @nonconsensualopinion
      @nonconsensualopinion 3 years ago +1

      @@NormalizedNerd I'm on the edge of subscribing. A video on MCMC would convince me to subscribe and never leave!

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      @@nonconsensualopinion 😂😂...Let's see what comes next

  • @jtanium
    @jtanium 2 years ago +7

    I've been going through these videos and doing the calculations by hand to make sure I understand the math correctly. When I tried to calculate the left eigenvector ([0.218, 0.273, 0.509]) using the method described in the first video (Markov Chains Clearly Explained Part 1), I got a different result ([0.168, 0.273, 0.559]), and I'm wondering if I missed a step. Here's what I did: starting with [0 0 1] meaning it is sunny, pi0A = [0.0 0.3 0.7], pi1A = [0.12 0.27 0.61], pi2A = [0.168, 0.273, 0.559]. It's interesting that the second element matches. If anyone can help me understand where I went wrong, I'd greatly appreciate it!

    • @sushobhitrathore2555
      @sushobhitrathore2555 2 years ago

      You must start with the probability of states as [0 1 0]... you will get the same values.

    • @angadbagga9166
      @angadbagga9166 1 year ago

      @@sushobhitrathore2555 - lol, why [0 1 0]? Sunny is at the last position. And even if we go your way we don't get the right result: pi2A per your method is [0.252, 0.272, 0.476].
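
(Editor's note: the discrepancy in this thread is just early truncation — three multiplications starting from [0 0 1] have not converged yet. A minimal sketch below; the transition matrix A is reconstructed from the iterates quoted in the comment, and the state order rainy, cloudy, sunny is an assumption. Iterating further does reach [0.218, 0.273, 0.509].)

```python
import numpy as np

# Transition matrix reconstructed from the comment's iterates
# (assumed state order: rainy, cloudy, sunny).
A = np.array([
    [0.5, 0.3, 0.2],
    [0.4, 0.2, 0.4],
    [0.0, 0.3, 0.7],
])

pi = np.array([0.0, 0.0, 1.0])  # "starting with [0 0 1] meaning it is sunny"
for _ in range(100):            # keep multiplying: pi <- pi @ A
    pi = pi @ A

print(pi.round(3))  # -> [0.218 0.273 0.509], the stationary distribution
```

After only three steps this gives [0.168, 0.273, 0.559], exactly as the comment found; the power iteration simply needs more steps to settle.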

  • @Slappydafrog_
    @Slappydafrog_ 4 months ago

    Hello from the future! Thank you for sharing your insights!

  • @ITTOMC
    @ITTOMC 1 year ago +1

    Simply wonderful. Keep up your excellent work. Really really well done!

  • @nad4153
    @nad4153 2 years ago +5

    Nice video! But what happens to P(Y) at 8:30 in the final formula? Why does it disappear?

    • @marius7005
      @marius7005 1 year ago +1

      We want to find the X_1, ..., X_n that gives us the maximum value. Note that P(Y) does not depend on the Xs and is therefore a constant. A constant does not change the Xs that give us the maximum.

  • @surajpandey86
    @surajpandey86 3 years ago +1

    I really like the visuals and your presentation with the content. Good work!

  • @ramchillarege1658
    @ramchillarege1658 1 year ago +1

    Very nice and clear. Thank you.

  • @VersatileAnthem
    @VersatileAnthem 2 years ago

    Superb explanations! That shows how in-depth your knowledge is!

  • @streamindigoflags8035
    @streamindigoflags8035 2 years ago

    Thanks a lot! These types of videos are amazing and help me understand the concepts from the books in a good way. It boosts my interest in this area, and it helps me a lot in doing my project! Please make these kinds of videos for almost all concepts!😍

  • @truongsonmai6915
    @truongsonmai6915 3 years ago

    Very helpful video as well as the rest of the Markov series. Wish you luck from Vietnam!

  • @barhum5765
    @barhum5765 1 year ago

    Bro you are a king

  • @dannymoore1530
    @dannymoore1530 3 months ago

    This was good! Thank you.

  • @Cookies4125
    @Cookies4125 2 years ago +2

    Thanks for the explanation! You went through the math of how to simplify \argmax P(X=X_1,\cdots,X_n\mid Y=Y_1,\cdots,Y_n) but how do you actually compute the argmax once you've done the simplification? There must be a better way than to brute force search through all combinations of values for X_1,\cdots,X_n right?
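
(Editor's note: there is indeed a better way than brute force — the Viterbi algorithm computes this argmax by dynamic programming in O(n·S²) time instead of enumerating all Sⁿ sequences. A minimal sketch; the matrices below are hypothetical placeholders for illustration, not the video's values.)

```python
import numpy as np

# Hypothetical model (placeholders, not the video's numbers).
# Hidden states: 0 = rainy, 1 = sunny; observations: 0 = sad, 1 = happy.
T = np.array([[0.6, 0.4],   # T[i, j] = P(X_t = j | X_{t-1} = i)
              [0.3, 0.7]])
E = np.array([[0.8, 0.2],   # E[i, k] = P(Y_t = k | X_t = i)
              [0.1, 0.9]])
pi = np.array([0.4, 0.6])   # initial distribution over hidden states

def viterbi(obs):
    """Most likely hidden-state sequence for an observed sequence."""
    n, S = len(obs), len(pi)
    delta = np.zeros((n, S))      # best log-prob of any path ending in state s
    psi = np.zeros((n, S), int)   # backpointers
    delta[0] = np.log(pi) + np.log(E[:, obs[0]])
    for t in range(1, n):
        scores = delta[t - 1][:, None] + np.log(T)  # scores[prev, cur]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(E[:, obs[t]])
    # backtrack from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

print(viterbi([1, 1, 0]))  # -> [1, 1, 0], i.e. sunny, sunny, rainy
```

Working in log-probabilities avoids numerical underflow on long sequences; the backpointer table `psi` is what lets us recover the argmax path without enumerating all combinations.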

  • @mika-ellajarapa5646
    @mika-ellajarapa5646 2 years ago +1

    Hello!!!! Please explain conditional random fields :( Thank you.

  • @varshachinnammal
    @varshachinnammal 1 year ago

    Dear sir,
    Your explanation was very clear and understandable. It is full of mathematics: matrices, probability and so on. I'm from a science background without maths. I needed this for bioinformatics, but it is difficult to relate nitrogenous bases to these matrices and formulae. Will you explain it in a simpler way? It would be very helpful, sir 🥺

  • @stefankermer7782
    @stefankermer7782 5 months ago

    This is great - I had to laugh a lot when you introduced indices to the matrices haha

  • @xiaoruidu8603
    @xiaoruidu8603 3 years ago +3

    please keep updating~ you are doing an amazing job~~.

  • @islemsahli737
    @islemsahli737 2 months ago

    Thank you, well explained.

  • @keithmartinkinyua2067
    @keithmartinkinyua2067 1 year ago +1

    This video is good, but it left me hanging. I was expecting you to calculate the probability at the end.

  • @Garrick645
    @Garrick645 17 days ago +1

    As soon as the emojis left, the video went over my head.

  • @asmaaziz2436
    @asmaaziz2436 3 years ago

    Explained so simply. Well done. Helped me a lot.

  • @aveenperera3128
    @aveenperera3128 3 years ago +1

    Please do more examples for hidden Markov models.

  • @tetamusha
    @tetamusha 1 year ago

    I got a little confused with the two HMM videos. I thought the second video would solve the argmax expression presented at the end of the first one, but the algorithm that solves this expression is the Viterbi algorithm and not the Forward algorithm from the second video. Just a heads-up to those that got a little lost like me.

  • @tyrannicalguy7262
    @tyrannicalguy7262 1 year ago

    "Hello People From The Future!" that was very thoughtful

  • @lucasqwert1
    @lucasqwert1 11 months ago

    How do we calculate the stationary distribution of the first state? I watched your previous videos but still can't calculate it! Thanks for answering!

  • @user-te6bh6rg5i
    @user-te6bh6rg5i 11 months ago

    Hi, thanks for the wonderful videos on Markov chains. I just want to know: how do you define the transition and emission probabilities? What to do about unknown state probabilities? Regards

  • @muqaddaszahranaqvi5585

    Finally, some video helped.

  • @shivkrishnajaiswal8394
    @shivkrishnajaiswal8394 2 years ago +1

    good one

  • @MB-rc8ie
    @MB-rc8ie 3 years ago

    This is gold. Thanks for uploading. Very helpful

  • @JVenom_
    @JVenom_ 1 year ago +4

    not me studying for my final in 6 hrs

  • @vocabularybytesbypriyankgo1558

    Great explanation, super helpful

  • @abdsmmeat7492
    @abdsmmeat7492 1 year ago

    We need more videos on Markov chains

  • @annanyatyagip
    @annanyatyagip 6 months ago

    excellent video!

  • @Extra-jv5xr
    @Extra-jv5xr 5 months ago

    Is this related to the Viterbi algorithm? Could you make a video on that?

  • @user-mq4em1dq1q
    @user-mq4em1dq1q 3 years ago +1

    Best videos! Hope to see more videos about Markov chains! Thank you

  • @jasonhuang2270
    @jasonhuang2270 1 year ago

    Thank you so much!

  • @ishhanayy
    @ishhanayy 9 months ago

    Very well explained ! 👏👏

  • @sudeshnasani3451
    @sudeshnasani3451 2 years ago

    Excellent explanation

  • @elixvx
    @elixvx 3 years ago +1

    thank you for your videos! please continue the great work!

  • @antoninmartykan9004
    @antoninmartykan9004 10 months ago

    Fantastic series! I am just wondering, is there a more efficient way to calculate the maximum than to try all possible combinations? Thanks!

  • @24CARLOKB
    @24CARLOKB 2 years ago

    Great video! Just a little lost on where you get P(sad or happy | weather), which I think are the emission probabilities? Thanks!

  • @Sam-rz5hw
    @Sam-rz5hw 4 months ago

    excellent explanation!

  • @siphamandlamazibuko1424
    @siphamandlamazibuko1424 5 months ago

    Can someone please explain where the P(Y) in the denominator went when the products were substituted into Bayes' theorem?

  • @rohanthakrar7599
    @rohanthakrar7599 1 year ago

    Can you explain the calculation of P(X|Y), the last step of the video, when you put in the products of P(Y|X) and P(X|X)? Where does the P(Y) in the denominator go? Thanks

    • @Periareion
      @Periareion 1 year ago +1

      Howdy!
      I think of it like this: P(Y) is a constant, and doesn't affect which sequence X has the highest probability of occurring. In other words: since every term gets divided by P(Y), we can just ignore it.
      Perhaps he could've made that a little clearer in the video.
      Cheers!

  • @atifshakeel1982
    @atifshakeel1982 1 year ago

    Hi, excellent and very easy way of explaining complex mathematical terms. Did you make these slides in PowerPoint or some other tool? Regards,

  • @stephenchurchill3929
    @stephenchurchill3929 1 year ago

    Great Video, Thanks!

  • @rustemdevletov2176
    @rustemdevletov2176 3 years ago +1

    I don't understand how you got the values of the stationary distribution. I'm trying to calculate it myself and getting totally different numbers.

  • @acesovernines
    @acesovernines 9 months ago

    Superb explanation

  • @tarazdxs
    @tarazdxs 3 months ago

    Thanks for the video. Is it possible to access the Python code you wrote for this problem?

  • @BasilAltaie
    @BasilAltaie 3 years ago

    Where can I find the software you are using for the mathematical visualizations? Please advise.

  • @abhivaryakumar3107
    @abhivaryakumar3107 1 year ago

    Amazing video.
    I really needed that, and thank you so much for explaining everything so clearly.
    If I may have one request, could you please share the Python code?
    Thank you so much

  • @minjun87
    @minjun87 1 year ago

    Thanks for the great video. What would be the hidden-state sequence data and observed sequence data in a speech-to-text use case?

  • @ameyapawar7097
    @ameyapawar7097 2 years ago

    Very well explained!

  • @mediwise2474
    @mediwise2474 1 year ago

    Sir, please make a video on the finite element method.

  • @bassami74
    @bassami74 2 years ago

    Thank you for sharing. Could you please explain how to apply an HMM to measuring earnings quality? Need your help 🙏🙏

  • @nakjoonim
    @nakjoonim 3 years ago

    You deserve my subscription, thanks a lot!

  • @Gamer2002
    @Gamer2002 1 year ago

    You should make more videos, you are awesome

  • @aadi.p4159
    @aadi.p4159 8 months ago

    Super, super good. Awesome work, bro.

  • @ThePablo505
    @ThePablo505 1 year ago

    Something that looks so hard, you make understandable even for a 4-year-old kid.

  • @SousanTarahomi-vh2jp
    @SousanTarahomi-vh2jp 4 months ago

    Thanks!

  • @festusboakye3698
    @festusboakye3698 2 years ago

    Thank you so much for making this video.
    Could you please extend the HMM to a tree HMM using the forward algorithm to make inferences?

  • @meriembaha6887
    @meriembaha6887 2 years ago

    Thank you

  • @jonathanisai9286
    @jonathanisai9286 3 years ago +1

    Amazing video bro! you rock!

  • @Paranorman2049
    @Paranorman2049 1 month ago

    One question, though: why is the probability of the observed sequence, P(Y), neglected?

  • @mayank9986
    @mayank9986 11 months ago

    Excellent

  • @nchsrimannarayanaiyengar8003

    Very nice explanation, thank you.

  • @karannchew2534
    @karannchew2534 2 years ago

    Hi Normalised Nerd, and Everyone Reading This, I have a question please. Can I use HMM the other way round: find the mood sequence given the weather sequence? How?

  • @vivekbhat720
    @vivekbhat720 3 years ago +1

    I have some doubts.
    Since the Markov chain is hidden, we won't know the transition matrix, right?
    We'll only know the emission matrix.
    P(Yi | Xi) can be found from the emission matrix,
    but how do we find P(Xi | Xi-1)?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +5

      That's a great question! Let me clarify. In this video I've assumed that our HMM is already trained, i.e. we know both the transition (T) and emission (E) matrices. T & E are the parameters of our model that are learnt during training. In the future I shall make a video about training HMMs.
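
(Editor's addendum to this answer: once T and E are known, the quantity being maximized in the video is easy to write down. A sketch with hypothetical placeholder numbers, not the video's values, scoring one candidate hidden sequence and brute-forcing the argmax for a short observation sequence:)

```python
import numpy as np
from itertools import product

# Hypothetical trained parameters (placeholders, not the video's values).
# Hidden states: 0 = rainy, 1 = sunny; observations: 0 = sad, 1 = happy.
T = np.array([[0.6, 0.4],
              [0.3, 0.7]])   # T[i, j] = P(X_t = j | X_{t-1} = i)
E = np.array([[0.8, 0.2],
              [0.1, 0.9]])   # E[i, k] = P(Y_t = k | X_t = i)
pi = np.array([0.4, 0.6])    # initial distribution over hidden states

def joint(hidden, observed):
    """P(X = hidden, Y = observed) = pi(X1) * prod of T terms * prod of E terms."""
    p = pi[hidden[0]] * E[hidden[0], observed[0]]
    for prev, cur, obs in zip(hidden, hidden[1:], observed[1:]):
        p *= T[prev, cur] * E[cur, obs]
    return p

# Brute-force argmax over all 2^3 hidden sequences for Y = (happy, happy, sad).
obs = (1, 1, 0)
best = max(product([0, 1], repeat=len(obs)), key=lambda h: joint(h, obs))
print(best)  # -> (1, 1, 0): sunny, sunny, rainy
```

This enumeration blows up as S^n for longer sequences, which is why the Viterbi algorithm (dynamic programming over the same joint probability) is used in practice.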