1 Principal Component Analysis | PCA | Dimensionality Reduction in Machine Learning by Mahesh Huddar

  • Published 5 Sep 2024

Comments • 93

  • @kenway346
    @kenway346 1 year ago +118

    I noticed that your channel contains the entirety of Data Mining taught at the Master's level! Thank you very much, subscribing immediately!

  • @ishu_official
    @ishu_official 8 months ago +22

    Super explanation.. today is my machine learning paper

  • @VDCreatures-kc6uf
    @VDCreatures-kc6uf 9 months ago +6

    Super explanation.. the best channel on CZcams to learn machine learning and ANN topics ❤❤

    • @MaheshHuddar
      @MaheshHuddar  9 months ago +1

      Thank you.
      Do like, share, and subscribe.

  • @junaidahmad218
    @junaidahmad218 8 months ago +2

    This man has in-depth knowledge of this topic.

  • @TrueTalenta
    @TrueTalenta 1 year ago +21

    Amazing step-by-step outline!
    I love it 💌, so I subscribed!

  • @jambulingamlogababu8914
    @jambulingamlogababu8914 10 months ago +3

    Excellent teaching. Salute to you, sir.

  • @venkateshwarlupurumala6283

    Very clear explanation, Sir... Thank you so much...

  • @NandeeshBilagi
    @NandeeshBilagi 5 months ago +1

    Clear and nice explanation. Thanks for the video

  • @rodrigorcbb
    @rodrigorcbb 1 year ago +4

    Thanks for the video. Great explanation!

  • @radhay4291
    @radhay4291 10 months ago +1

    Thank you very much. Very clear explanation, and it is easy to understand.

  • @Dinesh-be8ys
    @Dinesh-be8ys 1 year ago +3

    thank you for uploading videos like this

  • @krishnachaitanya3089
    @krishnachaitanya3089 1 year ago +2

    That's the clearest explanation I have seen.

  • @kapras711
    @kapras711 1 year ago +1

    super explanation .. very easy to understand without any hiccups, sir
    thanks ... Inspr KVV.Prasad

  • @user-qz4yu5lo1r
    @user-qz4yu5lo1r 8 months ago +1

    thank you sir for this wonderful concept

  • @ashishlal7212
    @ashishlal7212 4 months ago

    Thank you so much, today is my data mining and ML paper.

  • @thilagarajthangamuthu2935

    Thank you sir. Clear and easy to understand. Thank you.

  • @shubhangibaruah3940
    @shubhangibaruah3940 8 months ago +1

    thank you sir, you were amazing🤩

    • @MaheshHuddar
      @MaheshHuddar  8 months ago

      Welcome.
      Please do like, share, and subscribe.

  • @Husain567
    @Husain567 15 days ago

    Best explanation

  • @priyalmaheta690
    @priyalmaheta690 5 months ago

    The content and teaching are very good. Please also provide the notes; they would be helpful.

  • @Blackoutfor10days
    @Blackoutfor10days 4 months ago +2

    Can you add the concept of hidden Markov models to your machine learning playlist?

  • @Straight_Forward615
    @Straight_Forward615 5 months ago

    thanks a lot for this wonderful lecture.

  • @yuva_india123
    @yuva_india123 1 year ago +1

    Thanks sir for your explanation 🎉

  • @yashtiwari4696
    @yashtiwari4696 1 year ago +3

    Sir, please upload content on ensemble methods: bagging, boosting, and random forest.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +3

      Ensemble Learning: czcams.com/video/eNyUfpGBLts/video.html
      Random Forest: czcams.com/video/kPq328mJNE0/video.html

  • @srinivas664
    @srinivas664 1 month ago

    Nice presentation, thank you sir

  • @SaifMohamed-de8uo
    @SaifMohamed-de8uo 7 months ago

    thank you so much, you are a great professor

    • @MaheshHuddar
      @MaheshHuddar  7 months ago +1

      You are very welcome.
      Do like, share, and subscribe.

  • @sinarezaei4288
    @sinarezaei4288 6 months ago

    Thank you very much, Master Huddar ❤

  • @priya1912
    @priya1912 8 months ago +1

    thank you so much

  • @RaviShankar-gm9of
    @RaviShankar-gm9of 7 months ago +3

    Super, brother ...

  • @HamidAli-ff2zn
    @HamidAli-ff2zn 1 year ago

    Thank you so much sir, amazing explanation ♥♥♥

  • @waidapapa1514
    @waidapapa1514 1 year ago +2

    Why are we not dealing with e2? I mean, why don't we compute e2^T · [cov matrix]?

    • @rohanshah8129
      @rohanshah8129 11 months ago +1

      Here, 2-dimensional data was taken as the example of high-dimensional data, and one of the main use cases of PCA is dimensionality reduction. If you want, you can use e2 and get the second PC, but think about it: from 2 variables we would again get 2 variables, which is why he has shown only PC1. In practice we generally keep 2 PC axes (it mostly depends on your data); if the data has a lot of variables, 3 or 4 can also be good, but we rarely go beyond that. In that case you would need e2, e3, and e4 as well; see the sketch below.
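
      A minimal numpy sketch of this point (illustrative toy data assumed, not the video's numbers): projecting onto e1 alone reduces 2 variables to 1, while projecting onto both e1 and e2 merely rotates the data, so you still have 2 variables.

          import numpy as np

          rng = np.random.default_rng(0)
          # correlated 2-D toy data: 100 samples, 2 variables
          X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.7], [0.7, 1.0]])

          Xc = X - X.mean(axis=0)                  # center the data
          vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
          vecs = vecs[:, ::-1]                     # columns sorted by decreasing eigenvalue

          pc1 = Xc @ vecs[:, 0]                    # e1 only: 2 variables reduced to 1
          both = Xc @ vecs                         # e1 and e2: still 2 variables (a rotation)
          print(pc1.shape, both.shape)             # (100,) (100, 2)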

  • @putridisperindag6986
    @putridisperindag6986 9 months ago +1

    Thank you very much Sir for your explanation in that video. I am still confused, so I would like to ask how to get the values [-4.3052, 3.7361, 5.6928, -5.1238]. I still don't get it. Thank you Sir

    • @jvbrothers5454
      @jvbrothers5454 8 months ago

      Yeah, I'm also confused about how he got those; I'm getting different values: 0.3761, 5.6928, -5.128
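
      A hedged worked example: assuming the example data is X1 = (4, 8, 13, 7) and X2 = (11, 4, 5, 14) (inferred here because these reproduce the quoted projections), each value is the dot product of the first eigenvector with one mean-centered data point:

          import numpy as np

          # assumed example data: columns are X1 and X2
          X = np.array([[4., 11.], [8., 4.], [13., 5.], [7., 14.]])

          Xc = X - X.mean(axis=0)            # means are (8, 8.5)
          C = np.cov(Xc, rowvar=False)       # [[14, -11], [-11, 23]]

          vals, vecs = np.linalg.eigh(C)     # eigenvalues in ascending order
          e1 = vecs[:, -1]                   # eigenvector of the largest eigenvalue
          if e1[0] < 0:                      # eigenvectors are sign-ambiguous, so
              e1 = -e1                       # fix the sign to match the video

          print(Xc @ e1)                     # approx. [-4.3052  3.7361  5.6928 -5.1238]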

  • @user-of6qi2hk1e
    @user-of6qi2hk1e 10 months ago

    Thank you very much sir

  • @nikks9969
    @nikks9969 6 months ago

    Hello sir, thank you for your explanation. I have a doubt at 08:17: why have you considered only the first equation?

    • @MaheshHuddar
      @MaheshHuddar  6 months ago

      You will get the same answer with the second equation.
      You can use either the first or the second, no issues; see the quick check below.
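
      A quick numeric check of this (covariance matrix assumed from the example, C = [[14, -11], [-11, 23]]): at an eigenvalue, (C - λI)e = 0 is singular, so either of its two rows pins down the same ratio e2/e1.

          import numpy as np

          C = np.array([[14., -11.], [-11., 23.]])  # assumed covariance matrix
          lam = np.linalg.eigvalsh(C)[-1]           # largest eigenvalue

          # (C - lam*I)e = 0 gives two scalar equations in e1 and e2;
          # each fixes the same ratio e2/e1:
          ratio_eq1 = (lam - C[0, 0]) / C[0, 1]     # row 1: (14 - lam)e1 - 11 e2 = 0
          ratio_eq2 = C[1, 0] / (lam - C[1, 1])     # row 2: -11 e1 + (23 - lam)e2 = 0
          print(ratio_eq1, ratio_eq2)               # identical up to rounding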

  • @RaviShankar-gm9of
    @RaviShankar-gm9of 7 months ago

    Please make a video on linear discriminant analysis, brother.

  • @MadaraUchiha-wj8sl
    @MadaraUchiha-wj8sl 7 months ago

    Thank you, sir.

  • @zafar151
    @zafar151 6 months ago

    Excellent

  • @mango-strawberry
    @mango-strawberry 3 months ago

    thanks a lot

    • @MaheshHuddar
      @MaheshHuddar  3 months ago

      You are most welcome.
      Do like, share, and subscribe.

  • @Ateeq10
    @Ateeq10 7 months ago

    Thank you

  • @Abhilashaisgood
    @Abhilashaisgood 3 months ago

    Thank you sir, how do I calculate the 2nd PC?

    • @MaheshHuddar
      @MaheshHuddar  3 months ago

      Select the second eigenvector and multiply it with the (mean-centered) feature matrix, as in the sketch below.
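
      A short sketch of that step (same assumed example data as in the earlier reply): the second PC scores come from the eigenvector of the second-largest eigenvalue.

          import numpy as np

          X = np.array([[4., 11.], [8., 4.], [13., 5.], [7., 14.]])  # assumed data
          Xc = X - X.mean(axis=0)                                    # center first

          vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))     # ascending order
          e2 = vecs[:, -2]        # eigenvector of the second-largest eigenvalue

          pc2 = Xc @ e2           # second principal component score for each point
          print(pc2)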

  • @muhammadsaad3793
    @muhammadsaad3793 9 months ago

    Nice!

  • @advancedappliedandpuremath

    Sir, the book name please?

  • @user-kn7sr8tg4q
    @user-kn7sr8tg4q 1 year ago +2

    Hi Sir, great explanation of PCA. But when I searched how the covariance matrix works for more than 2 variables, everything showed covariance computed only between 2 variables.
    How do you calculate the covariance if a dataset has more than 2 variables? Could you please give an explanation of that?

    • @fintech1378
      @fintech1378 1 year ago +2

      you need to do it for all pairwise combinations

    • @shahmirkhan1502
      @shahmirkhan1502 1 year ago +3

      @fintech1378 is right. You need to do pairwise combinations. For example, for 4 variables, your covariance matrix will be 4x4 with the following combinations:
      cov(a, a)  cov(a, b)  cov(a, c)  cov(a, d)
      cov(b, a)  cov(b, b)  cov(b, c)  cov(b, d)
      cov(c, a)  cov(c, b)  cov(c, c)  cov(c, d)
      cov(d, a)  cov(d, b)  cov(d, c)  cov(d, d)

    • @rohanshah8129
      @rohanshah8129 11 months ago +1

      If there are n variables, the covariance matrix will be of n×n shape; see the numpy check at the end of this thread.

    • @parthibdey6005
      @parthibdey6005 6 months ago

      Is this covariance for reducing 4 variables to 1? @shahmirkhan1502
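
      A quick numpy check of this (toy 4-variable data assumed): np.cov builds exactly this n x n matrix of pairwise covariances.

          import numpy as np

          rng = np.random.default_rng(1)
          X = rng.normal(size=(50, 4))   # 50 samples of 4 variables a, b, c, d

          C = np.cov(X, rowvar=False)    # 4x4 matrix: C[i, j] = cov(variable i, variable j)
          print(C.shape)                 # (4, 4)
          print(np.allclose(C, C.T))     # True: cov(a, b) == cov(b, a)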

  • @user-of4rz5cq1i
    @user-of4rz5cq1i 1 year ago +1

    I loooove you, Soso

  • @jameykeller5055
    @jameykeller5055 8 months ago

    You are a god, sir.

  • @brucewayne.64
    @brucewayne.64 9 months ago

    Thanks Sir