Machine Learning 3.2 - Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA)

  • Published 1 Jun 2024
  • We will cover classification models in which we estimate a probability distribution for each class. We can then compute the likelihood of each class for a new observation and assign the observation to the class with the greatest likelihood. These maximum likelihood methods, such as the LDA and QDA methods you will see in this section, are often the best methods to use on data whose classes are well approximated by standard probability distributions; a short code sketch follows below.
    This material complements pp. 138-149 of An Introduction to Statistical Learning (faculty.marshall.usc.edu/garet....
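
    The sketch below is not from the video; it is a minimal illustration of the LDA/QDA workflow described above, assuming scikit-learn and NumPy are available. The class means, covariances, and sample sizes are made up for the example.

    ```python
    # Minimal LDA/QDA sketch on synthetic two-class Gaussian data.
    # Assumption: the means and covariances below are invented for illustration.
    import numpy as np
    from sklearn.discriminant_analysis import (
        LinearDiscriminantAnalysis,
        QuadraticDiscriminantAnalysis,
    )

    rng = np.random.default_rng(0)

    # Draw each class from its own multivariate normal distribution.
    X0 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], size=200)
    X1 = rng.multivariate_normal([2.0, 2.0], [[1.5, -0.4], [-0.4, 0.8]], size=200)
    X = np.vstack([X0, X1])
    y = np.array([0] * 200 + [1] * 200)

    # LDA pools one covariance matrix across classes (linear boundary);
    # QDA fits a separate covariance matrix per class (quadratic boundary).
    lda = LinearDiscriminantAnalysis().fit(X, y)
    qda = QuadraticDiscriminantAnalysis().fit(X, y)

    # Classify a new observation by the class with the greatest likelihood
    # (equivalently, the greatest posterior probability).
    x_new = np.array([[1.0, 1.0]])
    print("LDA class probabilities:", lda.predict_proba(x_new))
    print("QDA class probabilities:", qda.predict_proba(x_new))
    print("LDA assigns class:", lda.predict(x_new)[0])
    ```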

Comments • 25

  • @lizzy1138 · 3 years ago +4

    Thanks for this! I needed to clarify these methods in particular; I was reading about them in ISLR.

  • @neftalisalazar2352 · 3 months ago +1

    I enjoyed watching your video, thank you. I will watch more of your machine learning videos!

  • @huilinchang8027 · 3 years ago +8

    Awesome lecture, thank you professor!

  • @Spiegeldondi · 1 year ago +1

    A very good and concise explanation, even starting with the explanation of likelihood. Very well done!

  • @vi5hnupradeep · 3 years ago +4

    Thank you so much! Cleared a lot of my doubts.

  • @spencerantoniomarlen-starr3069

    10:48 Ohhhh, I was just going back and forth between the sections on LDA and QDA in three different textbooks (An Introduction to Statistical Learning, Applied Predictive Analytics, and Elements of Statistical Learning) for well over an hour, and that multivariate normal pdf was really throwing me off, mostly because of the capital sigma to the negative first power term. I didn't realize it was literally a capital sigma; I kept thinking it was a summation of something!
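
    For reference, the density being discussed is the multivariate normal pdf; the capital sigma there is the p × p covariance matrix Σ, and Σ⁻¹ is its matrix inverse, not a summation:

    ```latex
    f(\mathbf{x}) = \frac{1}{(2\pi)^{p/2}\,\lvert\boldsymbol{\Sigma}\rvert^{1/2}}
      \exp\!\Bigl(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\top}\boldsymbol{\Sigma}^{-1}(\mathbf{x}-\boldsymbol{\mu})\Bigr)
    ```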

  • @Dhdhhhjjjssuxhe · 1 year ago +2

    Good job. It is very easy to follow and understand

  • @ofal4535 · 1 year ago +2

    I was trying to read it myself, but you made it so much simpler.

  • @geo123473 · 7 months ago +1

    Great video! Thank you, professor! :)

  • @JappieYow · 3 years ago +6

    Interesting and clear explanation! Thank you very much, this will help me in writing my thesis!

  • @zhengcao6529 · 3 years ago +1

    You are so great. Keep it up, please.

  • @jaafarelouakhchachi6170 · 1 month ago +1

    Can you share the slides from the videos with me?

  • @user-mw6vi2te3s · 2 years ago +2

    Very useful information, thank you, professor!

  • @kaym2332 · 3 years ago +1

    Hi! If the classes are assumed to be normally distributed, does that imply that the features making up an observation are normally distributed as well?

    • @billbasener8784 · 3 years ago +1

      Yes. If each class has a multivariate normal distribution, then each individual feature variable has a univariate normal distribution.
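
      In symbols, this is the marginalization property of the multivariate normal: each coordinate of a multivariate normal vector is itself normal, with mean and variance read off the mean vector and the diagonal of the covariance matrix:

      ```latex
      \mathbf{X} \sim \mathcal{N}_p(\boldsymbol{\mu}, \boldsymbol{\Sigma})
        \;\Longrightarrow\; X_i \sim \mathcal{N}(\mu_i, \Sigma_{ii}),
        \quad i = 1, \dots, p.
      ```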

  • @MrRynRules · 2 years ago +1

    Thank you sir, well explained.

  • @pol4624 · 2 years ago

    Very good video, thank you, professor.

    • @billbasener8784 · 2 years ago

      I am glad it is helpful. Thank you for the kind words!

  • @saunokchakrabarty8384

    How do you get the values of 0.15 and 0.02? I'm getting different values.

    • @rmharp · 8 months ago

      Agreed. I got approximately 0.18 and 0.003, respectively.

  • @haitaoxu3468 · 3 years ago

    Could you share the slides?