Lecture 20 - Linear Discriminant Analysis (LDA) (with Solved Example)

  • Published 5. 09. 2024
  • This video is about Linear Discriminant Analysis.
    If you are interested in building cool Natural Language Processing (NLP) apps, access our NLP APIs at www.firstlangu... . Also, for NLP product development and consultation, please reach out to us at info@firstlanguage.in.

Comments • 83

  • @bhavikdudhrejiya4447
    @bhavikdudhrejiya4447 4 years ago +34

    I searched a lot for an LDA example, and finally I found one with the mathematics behind the calculation.
    Very useful video. Thank you, ma'am. I appreciate your hard work.

  • @ssitakiran
    @ssitakiran 2 years ago +7

    Thank you, madam. This is plain old-school teaching at its best. No fuss, just imparting knowledge!

  • @aayushpurohit8126
    @aayushpurohit8126 3 years ago +40

    You have calculated Sw^-1 correctly later, but used the wrong values to find the vector.

    • @parvathykrishna.r536
      @parvathykrishna.r536 8 months ago

      Yes😂

    • @deepdas594
      @deepdas594 4 months ago

      @@parvathykrishna.r536 We wasted 20 minutes of our lives. Give a disclaimer next time.

  • @gulabpreetsingh9985
    @gulabpreetsingh9985 1 year ago +3

    Hello ma'am, at 9:17 how did you calculate the value [-0.91 -0.39]^T? Would you elaborate on this, ma'am?

  • @eimajin8
    @eimajin8 4 years ago +1

    So far the best LDA explanation I have found!

  • @orchidchetiaphukan4658
    @orchidchetiaphukan4658 4 years ago +5

    Finally found a detailed explanation of LDA. Thank you, ma'am.

  • @marieeriksson7909
    @marieeriksson7909 2 years ago +1

    This is the best video I have found on the subject. Great work!

  • @tracyyi4478
    @tracyyi4478 4 years ago +5

    Yes, it is wrong. Very confusing.
    Can you explain why the inverse you used before and after is different?

  • @shivanikothakapu1640
    @shivanikothakapu1640 1 year ago +1

    In Sw inverse you have substituted another value, ma'am; if we try it, we get a different answer.

  • @deepav7951
    @deepav7951 3 years ago +2

    The explanation was very detailed and nice, ma'am. In the calculation of S1, the fourth element should be 2.37, but it is written as 2.69. I think it was a calculation mistake. Please check.

  • @mohamednajiaboo9817
    @mohamednajiaboo9817 4 years ago +5

    Really helpful, thanks. But the inverse of Sw does not match when we calculate it. The concept is clear, though.

  • @shivamkolhe3242
    @shivamkolhe3242 3 years ago

    Ma'am, it was a very beautiful step-by-step explanation. I finally understood the steps.

  • @PrajaktaKharwandikar
    @PrajaktaKharwandikar 1 year ago

    Best lectures on ML. Thank you so much, ma'am, for your efforts.

  • @mithunshaha4342
    @mithunshaha4342 1 year ago

    Very nice explanation, ma'am. We need more tutorials from you.

  • @RookHMA
    @RookHMA 1 year ago

    Oh wow, I actually get something for once, thank you so much!

  • @faheemrajwadkar7100
    @faheemrajwadkar7100 2 years ago

    Best math explanation of LDA. Thank you!!

  • @vaibhavchhaya9145
    @vaibhavchhaya9145 4 years ago +1

    Thank you, ma'am! Loved the patient step-by-step approach. I just can't relate the notion of maximizing/minimizing an expression to finding the eigenvector. For example, in LDA we are trying to maximize the between-group variance and minimize the within-group variance; how does that relate to finding the eigenvector of the quotient of the two covariance matrices?
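
A note for readers with the same question: the Fisher criterion J(w) = (w^T Sb w) / (w^T Sw w) is maximized where its gradient vanishes, which yields the generalized eigenvalue problem Sb w = λ Sw w, i.e. (Sw^-1 Sb) w = λ w. So "maximize between-group over within-group variance" and "find the top eigenvector of Sw^-1 Sb" are the same problem. A minimal NumPy sketch, with illustrative numbers rather than the video's:

    import numpy as np

    # Illustrative scatter matrices and class means -- not the video's numbers.
    Sw = np.array([[0.8, -0.3],
                   [-0.3, 1.5]])   # within-class scatter
    m1 = np.array([2.0, 3.0])      # hypothetical class-1 mean
    m2 = np.array([5.0, 1.0])      # hypothetical class-2 mean
    d = (m1 - m2).reshape(-1, 1)
    Sb = d @ d.T                   # between-class scatter (outer product)

    # Maximizing J(w) gives Sw^-1 Sb w = lambda w, so the projection
    # direction is the eigenvector with the largest eigenvalue.
    vals, vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
    w = vecs[:, np.argmax(vals.real)].real
    w /= np.linalg.norm(w)         # normalized, like the video's [-0.91 -0.39]^T step
    print(w)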

  • @sruthisrinivasan3622
    @sruthisrinivasan3622 3 years ago

    Clearly explained, ma'am! Very easy to understand the concept! Thank you so much.

  • @newgenanand
    @newgenanand 4 years ago +4

    From 8:19 I get confused. To my understanding, (Sw)^(-1) is calculated in the note part, so we need that value along with the SB matrix to get v1 and v2. In the other method you also don't mention the (Sw)^(-1) value, instead changing the diagonal and sign values again. Please explain clearly, ma'am.
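
For context on the "changing the diagonal and sign" step: the inverse of a 2x2 matrix [[a, b], [c, d]] is (1/(ad - bc)) * [[d, -b], [-c, a]], i.e. swap the diagonal entries, negate the off-diagonal entries, and divide by the determinant. A quick check with an illustrative matrix, not the video's:

    import numpy as np

    A = np.array([[0.8, -0.3],
                  [-0.3, 1.5]])   # illustrative Sw
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    # Adjugate rule: swap diagonal, negate off-diagonal, divide by det.
    A_inv = np.array([[ A[1, 1], -A[0, 1]],
                      [-A[1, 0],  A[0, 0]]]) / det
    assert np.allclose(A_inv, np.linalg.inv(A))   # matches NumPy's inverse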

  • @chintu8700
    @chintu8700 3 years ago

    Excellent explanation, madam. I am ECE faculty at CBIT Hyderabad. Thank you, madam.

  • @Mrussninja
    @Mrussninja 3 years ago

    A way better explanation than my profs give!

  • @amberhayat5796
    @amberhayat5796 3 years ago +1

    For computing SB, you have written (m1-m2)*(m1-m2)^T, but you have done (m1-m2)^T*(m1-m2), which is wrong. Please explain.
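
The distinction raised here matters: with column vectors, (m1-m2)(m1-m2)^T is an outer product and yields a d x d matrix, while (m1-m2)^T(m1-m2) is an inner product and yields a single number. SB must be the matrix form. A small check with an illustrative difference vector:

    import numpy as np

    diff = np.array([[-5.4], [1.0]])    # illustrative m1 - m2 as a column vector
    Sb = diff @ diff.T                  # outer product: the 2x2 matrix SB
    scalar = (diff.T @ diff).item()     # inner product: just a number
    print(Sb.shape, scalar)             # (2, 2) 30.16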

  • @parthpatwari3174
    @parthpatwari3174 3 months ago

    You deserve more

  • @krishnageethakaruppasamy8493

    Wow! Good explanation. Thank you.

  • @niharthakkar4384
    @niharthakkar4384 3 years ago +1

    Best and the simplest explanation of LDA

  • @097-madhu3
    @097-madhu3 1 year ago

    Thank you, ma'am. I have a small query: could you please mention the X-axis and Y-axis?

  • @Annasupari
    @Annasupari 1 year ago +1

    Very informative video, madam. Can you please suggest any book with all the mathematical intuition behind the ML algorithms?

  • @thynguyenmai6133
    @thynguyenmai6133 2 years ago

    Thank you so much, ma'am. I finally understand it.

  • @ccuuttww
    @ccuuttww 4 years ago

    When you calculate the covariance matrix, it should be divided by n-k, where k is the number of classes
    (2 in this case), but I am not sure about this part.

  • @manoharnookala4212
    @manoharnookala4212 2 years ago

    Thank you, ma'am, it is an absolutely great session. A small request: if you explained the same thing using attributes, it would be great for us.

  • @nehasoni6235
    @nehasoni6235 4 years ago +5

    Nice explanation, madam. Thank you so much. The inverse written at 8:59 and used before this time is wrong; afterwards it's correct. Please check.

    • @tracyyi4478
      @tracyyi4478 4 years ago

      Yes, it is wrong. Very confusing.
      Can you explain why the inverse you used before and after is different?

  • @masterleague8349
    @masterleague8349 1 year ago

    At 7:00, the μ1-μ2 is incorrect; it should be [-5.4 1].

  • @Mr_ST_720
    @Mr_ST_720 2 years ago

    Thanks from Maharashtra

  • @Manishkumar-mw2zw
    @Manishkumar-mw2zw 1 year ago

    Amazing explanation.

  • @zuberpatel1312
    @zuberpatel1312 4 years ago +2

    Are Fisher discriminant analysis and linear discriminant analysis the same?

  • @parthpatwari3174
    @parthpatwari3174 3 months ago

    7:35 How did you make that matrix? I don't get it 😭. Can someone please explain quickly? I have an exam tomorrow 😱😱

  • @amitpanwarer
    @amitpanwarer 3 years ago

    Nice explanation. Can you suggest any textbook for pattern recognition with a good number of numericals?

  • @siddharthdedhia11
    @siddharthdedhia11 3 years ago

    Ma'am, your covariance matrix for class 1 is wrong. Please recheck the calculation, because when we calculate covariance we divide by n-1, not n.
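
A possible reconciliation for the n vs. n-1 (and 1/N) debates in these comments: LDA is usually stated with scatter matrices rather than covariance estimates, and multiplying Sw by any positive constant only rescales the eigenvalues of Sw^-1 Sb; the optimal direction is unchanged. A quick NumPy check with illustrative matrices:

    import numpy as np

    Sw = np.array([[0.8, -0.3], [-0.3, 1.5]])    # illustrative within-class scatter
    Sb = np.array([[29.16, -5.4], [-5.4, 1.0]])  # illustrative between-class scatter

    def lda_direction(Sw, Sb):
        vals, vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
        v = vecs[:, np.argmax(vals.real)].real
        return v / np.linalg.norm(v)

    # Switching 1/(n-1) to 1/n (here n = 5) rescales Sw but leaves the
    # direction unchanged, up to sign.
    v1 = lda_direction(Sw, Sb)
    v2 = lda_direction(Sw * (4 / 5), Sb)
    assert np.allclose(v1, v2) or np.allclose(v1, -v2)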

  • @ajaymaharjan8440
    @ajaymaharjan8440 4 years ago

    Hello ma'am,
    would it be possible to explain further how LDA extends to the linear/polynomial kernel trick, with the same numerical example?

  • @swethasays
    @swethasays 2 years ago

    Thank you very much!

  • @khoavuang7714
    @khoavuang7714 1 year ago

    Dear madam and everyone, I have a question that I need help with now. Could you show me how to calculate MSE in LDA? Thank you.

  • @azaralizadeh3492
    @azaralizadeh3492 4 years ago +1

    Thank you so much for the lecture. It was clear and easy to understand.

  • @rahulss3063
    @rahulss3063 10 months ago

    Ma'am, you have a mistake in the inverse formula.

  • @prof.meenav1550
    @prof.meenav1550 2 years ago

    Excellent video, ma'am.

  • @sangeethak6861
    @sangeethak6861 1 year ago

    Which textbook are you using? Kindly tell.

  • @user-qf6pd3pe2n
    @user-qf6pd3pe2n 9 months ago

    Helpful ❤❤

  • @mamatadas846
    @mamatadas846 3 years ago

    Thank you so much ma'am

  • @milanjain6295
    @milanjain6295 2 years ago

    Thank you so much

  • @Kunuswavik
    @Kunuswavik 1 year ago

    Thank you.

  • @shrinuraj9023
    @shrinuraj9023 3 years ago

    Thank you so much, ma'am.

  • @sachinkotikal5627
    @sachinkotikal5627 4 years ago

    Ma'am... how do we find the features for 3 classes???
    Do we use 3 mean values!!!!

  • @SKStation10
    @SKStation10 9 months ago

    Why is your Sw^-1 different between the final formula and the computation?

  • @ShubhangiKrishnan
    @ShubhangiKrishnan 1 year ago

    Can you please link a PDF of your notes in this video?

  • @alonfu5214
    @alonfu5214 2 years ago

    Thanks a lot, madam!!!!!!

  • @NazaninYari
    @NazaninYari 4 years ago

    This video was extremely useful. Thank you so much!

  • @jaroslavprucha9198
    @jaroslavprucha9198 2 years ago

    S is not covariance but rather scatter.

  • @sharkk2979
    @sharkk2979 4 years ago

    Nice explanation, ma'am. Love you.

  • @nguyenphong4065
    @nguyenphong4065 4 years ago

    Thank you!

  • @sajidahmed1330
    @sajidahmed1330 4 years ago +1

    Dear ma'am,
    At 5:55 you divide each of the entries by 5 (the total number of samples in the class) after summing the values from the 5 matrices (for the 5 individual samples), but this division step is not present in the formula at 4:56, which only shows the multiplication and summation parts. Is this a typo, or am I missing something here? Thanks in advance.

    • @newgenanand
      @newgenanand 4 years ago +1

      The 1/N should appear in that formula part, bro...
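
On the 1/N question: the usual textbook within-class scatter for class i is Si = Σ (x - mi)(x - mi)^T, summed over the class samples, with no 1/N term; dividing by Ni gives the (biased) class covariance instead, and with equal class sizes this again only rescales Sw without changing the resulting direction. A minimal sketch with made-up class-1 samples, not necessarily the video's data:

    import numpy as np

    # Hypothetical class-1 samples, one per row.
    X1 = np.array([[4.0, 2.0],
                   [2.0, 4.0],
                   [2.0, 3.0],
                   [3.0, 6.0],
                   [4.0, 4.0]])
    m1 = X1.mean(axis=0)

    centered = X1 - m1
    S1_scatter = centered.T @ centered    # sum of outer products, no 1/N
    S1_cov = S1_scatter / len(X1)         # the divided-by-N version
    print(S1_scatter, S1_cov, sep="\n")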

  • @samyukthamobile8447
    @samyukthamobile8447 4 years ago

    Understood the complete problem. Thanks.

  • @buarikazeem4156
    @buarikazeem4156 2 years ago

    Hello Dr. Subalalitha,
    after following one of your videos on LDA analysis:
    1. Is it okay to use the global mean of the matrix while calculating the individual class covariance, or should we use the group mean (just as you did in your video)?
    2. How do I draw the dimension-reduction graph (y = W^T * X)?
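
On question 2: once the direction w is found, each sample x reduces to the scalar y = w^T x, so the usual picture is the original 2-D scatter plot with a line along w, next to the projected 1-D values. A hedged matplotlib sketch; the data and w below are illustrative, not the video's:

    import numpy as np
    import matplotlib.pyplot as plt

    # Illustrative two-class data and LDA direction.
    X1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
    X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)
    w = np.array([0.91, 0.39])
    w /= np.linalg.norm(w)               # hypothetical unit-length direction

    y1, y2 = X1 @ w, X2 @ w              # y = w^T x for every sample

    fig, (ax0, ax1) = plt.subplots(1, 2, figsize=(9, 4))
    ax0.scatter(*X1.T, label="class 1")
    ax0.scatter(*X2.T, label="class 2")
    ax0.axline((0, 0), tuple(w), color="gray")   # the projection direction
    ax0.legend()
    ax0.set_title("original 2-D data")
    ax1.scatter(y1, np.zeros_like(y1))
    ax1.scatter(y2, np.zeros_like(y2))
    ax1.set_title("projected 1-D values y = w^T x")
    plt.show()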

  • @kelixoderamirez
    @kelixoderamirez 4 years ago

    Permission to learn, ma'am.

  • @souravl6882
    @souravl6882 4 years ago

    Thank you, aunty.

  • @MUHAMMADIRFAN-nz1gl
    @MUHAMMADIRFAN-nz1gl 4 years ago

    Madam, matrix S2 is not correct.

    • @subalalithacn5827
      @subalalithacn5827  4 years ago +1

      Thanks for informing me, but I cross-checked and it seems to be correct, sir.

    • @tracyyi4478
      @tracyyi4478 4 years ago

      @@subalalithacn5827 is it okay now?

  • @simpleman1546
    @simpleman1546 4 years ago +2

    Too many things going wrong here. (The explanation is good, but not the solving of the equations.)

  • @souvikbaruah8185
    @souvikbaruah8185 2 years ago

    Terribly wrong calculations, for God's sake!!!

  • @adrenochromeaddict4232
    @adrenochromeaddict4232 8 months ago

    In S_1, 2.6 should be 2.64.

  • @neelmishra2320
    @neelmishra2320 1 year ago

    Hi, remember that in academia it's always important to give appropriate credit. Please mention that you took the examples from the IIT Kharagpur slides at www.facweb.iitkgp.ac.in/~sudeshna/courses/ml08/lda.pdf. Thanks.

  • @satyamroy4727
    @satyamroy4727 3 years ago +3

    Hello ma'am, at 9:17 how did you calculate the value [-0.91 -0.39]^T? Would you elaborate on this, ma'am?