Principal Component Analysis (PCA) | Part 1 | Geometric Intuition

  • added 23 Jun 2024
  • This video focuses on providing a clear geometric intuition behind PCA. Learn the basics and set the foundation for understanding how PCA works in simplifying and preserving important information in your data.
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in/s/store
    ============================
    đŸ“± Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ⌚Time Stamps⌚
    00:00 - Intro
    00:44 - What is PCA
    05:16 - Benefits of using PCA
    07:33 - Geometric Intuition
    25:01 - What is variance and why is it important?

Comments • 85

  • @aienthu2071
    @aienthu2071 1 year ago +52

    So grateful for the videos that you make. I have burnt my pockets and spent hours on various courses just for the sake of effective learning, but most of the time I end up coming back to CampusX videos. Thank you so much.

  • @henrystevens3993
    @henrystevens3993 2 years ago +29

    Unbelievable... nobody taught me PCA like this. Sir, 5/5 for your teachings 🙏🙏 God bless you ❀

    • @vikramraipure6366
      @vikramraipure6366 1 year ago +1

      I am interested in group study with you; reply to me, bro.

  • @prasadagalave9762
    @prasadagalave9762 3 months ago +7

    04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
    08:30 PCA is a technique that transforms higher dimensional data to a lower dimensional data while preserving its essence.
    12:45 Feature selection involves choosing the most important features for predicting the output
    17:00 Feature selection is based on the spread of data on different axes
    21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
    25:30 PCA finds new coordinate axes to maximize variance
    29:45 Variance is a good measure to differentiate the spread between two data sets.
    33:54 Variance is important in PCA to maintain the relationship between data points when reducing dimensions.
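    The timestamped summary above can be illustrated with a minimal sketch (synthetic data, not from the video; assumes NumPy and scikit-learn are installed):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Toy 2-D dataset: two strongly correlated features
    # (e.g. "rooms" and something that tracks it closely).
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    X = np.column_stack([x, 0.5 * x + rng.normal(scale=0.1, size=200)])

    # Reduce 2 features to 1 while keeping as much variance as possible
    pca = PCA(n_components=1)
    X_reduced = pca.fit_transform(X)

    print(X_reduced.shape)                # (200, 1)
    print(pca.explained_variance_ratio_)  # share of variance kept: close to 1
    ```

    Because the two features are almost perfectly correlated, the single principal component preserves nearly all of the data's spread, which is exactly the "reduce dimensions while preserving the essence" idea from the summary.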

  • @SumanPokhrel0
    @SumanPokhrel0 1 year ago +5

    It's like I'm watching an NF series: at first you're introduced to different terms and methods and their use cases, and in the last 10 minutes of the video everything adds up and you realize what and why these strategies are in use. Amazing.

  • @rafibasha1840
    @rafibasha1840 2 years ago +20

    You have done good research on every topic, bro; nice explanation. I am so happy I found this channel, and at the same time I feel bad for not finding it earlier.

  • @makhawar8423
    @makhawar8423 1 year ago +2

    Can't have a better understanding of PCA than this. Saved so much time and energy. Thanks a lot.

  • @krishnakanthmacherla4431
    @krishnakanthmacherla4431 2 years ago +4

    Wow, I regret not finding this channel earlier. Everything is very clear, like a story; I could explain it to a 6-year-old and make him/her understand â€ïžđŸ‘

  • @geetikagupta5
    @geetikagupta5 1 year ago +7

    I am loving this channel more and more every time I watch a video here. The way the content is presented and created is really awesome. Keep inspiring and motivating us; I am learning a lot here.

  • @prankurjoshi9672
    @prankurjoshi9672 1 year ago

    No words to express how precious your teaching is....

  • @fahadmehfooz6970
    @fahadmehfooz6970 1 year ago

    Never have I seen a better explanation of PCA than this!

  • @eshandkt
    @eshandkt 2 years ago +2

    One of the finest explanations of PCA I have ever seen. Thank you, Sir!

  • @sidindian1982
    @sidindian1982 1 year ago

    Sir, the way you explained the Curse of Dimensionality & its solutions in the previous video -- just mind-blowing..... YOU ARE GOD

  • @kirtanmevada6141
    @kirtanmevada6141 6 months ago +1

    Totally an awesome playlist for learning Data Science/Mining or ML. Thank you so much, sir! Means a lot!!

  • @avinashpant9860
    @avinashpant9860 1 year ago +1

    Awesome explanation, and the best part is how he drops important info in the middle of a topic -- this video has such a good interpretation of a scatter plot that I wouldn't find even in a dedicated scatter-plot video. So perfect.

  • @balrajprajesh6473
    @balrajprajesh6473 1 year ago

    I thank God for blessing me with this teacher.

  • @zaedgtr6910
    @zaedgtr6910 8 months ago

    Amazing explanation.... No one can explain PCA as easily as you have done. Better than IIT professors.

  • @nawarajbhujel8266
    @nawarajbhujel8266 5 months ago

    To have this level of teaching, one should have a deep understanding of both the theoretical and the practical aspects. You have proved it again. Thanks for providing such valuable teaching.

  • @shubhankarsharma2221
    @shubhankarsharma2221 1 year ago

    Very nicely explained topics. One of the best teachers in ML.

  • @rahulkumarram5237
    @rahulkumarram5237 1 year ago +1

    Beautifully explained !!! Probably the best analogy one could come up with. Thank you, sir.

  • @siddiqkawser2153
    @siddiqkawser2153 4 days ago

    You rock, dude! Really appreciate that.

  • @raghavsinghal22
    @raghavsinghal22 2 years ago +1

    Best video for PCA. I'll definitely recommend it to my friends 🙂

  • @DataScienceSchools
    @DataScienceSchools 2 years ago

    Wow, how simply you did it.

  • @motivatigyan6417
    @motivatigyan6417 1 year ago

    You are outstanding for me, sir... I can't understand a topic until I watch your video.

  • @narmadaa2106
    @narmadaa2106 8 months ago

    Excellent, sir.
    I have listened to different video lectures on PCA,
    but I didn't understand it properly.
    Yours is the best one.
    Thank you so much.

  • @ritesh_b
    @ritesh_b 1 year ago

    Thanks for the great explanation; please keep explaining in this way.

  • @pankajbhatt8315
    @pankajbhatt8315 2 years ago

    Amazing explanation!!

  • @SameerAli-nm8xn
    @SameerAli-nm8xn 1 year ago +2

    First of all, the playlist is amazing; you have done a really good job explaining the concepts and intuitions behind the algorithms. I was wondering, could you create a separate playlist for the ARIMA, SARIMAX, and LSTM algorithms? I really want to see those algorithms in a future class.

  • @harsh2014
    @harsh2014 1 year ago

    Thanks for the explanations!

  • @jawadali1753
    @jawadali1753 2 years ago +2

    Your teaching style is amazing; you are a gem.

  • @ytg6663
    @ytg6663 3 years ago

    à€Źà€čà„à€€ à€žà„à€‚à€Šà€° à€čà„ˆđŸ‘đŸ‘đŸ™â€ïžđŸ”„

  • @qaiserali6773
    @qaiserali6773 2 years ago

    Great content!!!

  • @ParthivShah
    @ParthivShah 3 months ago +1

    Thank You Sir.

  • @jiteshsingh6030
    @jiteshsingh6030 2 years ago

    Just Wow đŸ”„ 😍

  • @user-nv9fk2jg5m
    @user-nv9fk2jg5m 7 months ago

    You are so good at this, I'm like "where have you been all this time".

  • @DeathBlade007
    @DeathBlade007 1 year ago

    Amazing Explanation

  • @jazz5314
    @jazz5314 1 year ago

    Wowww!!!! Best video

  • @beautyisinmind2163
    @beautyisinmind2163 1 year ago

    Damn, you are the Messiah in ML teaching

  • @yashjain6372
    @yashjain6372 1 year ago

    best explanation

  • @mukeshkumaryadav350
    @mukeshkumaryadav350 1 year ago

    amazing explanation

  • @armanmehdikazmi5390
    @armanmehdikazmi5390 6 months ago

    hats off to you sirrrr

  • @aadirawat4230
    @aadirawat4230 2 years ago

    Such an underrated channel for ML.

  • @msgupta07
    @msgupta07 2 years ago

    Amazing explanation... Can you share the OneNote for Windows 10 notes of this entire series, "100 Days of Machine Learning"?

  • @VIP-ol6so
    @VIP-ol6so 2 months ago

    great example

  • @rafibasha4145
    @rafibasha4145 2 years ago +1

    Hi bro, please make videos on feature selection techniques.

  • @761rishabh
    @761rishabh 3 years ago +1

    Nice Presentation sir

  • @ambarkumar7805
    @ambarkumar7805 1 year ago

    What is the difference between feature extraction and feature construction, since both reduce the number of features?

  • @1234manasm
    @1234manasm 1 year ago

    Very nice explanation. May I know which hardware you use to write on the notepad?

  • @sahilkirti1234
    @sahilkirti1234 3 months ago

    you are the god

  • @MARTIN-101
    @MARTIN-101 11 months ago +1

    Sir, you have no idea how much you are helping data learners like me. Thanks a lot. How can I help you? Is there anywhere I can pay you as a token of appreciation?

  • @mustafachenine7942
    @mustafachenine7942 2 years ago

    Is it possible to have an example of classifying pictures into two categories?
    If the dimensions are reduced with PCA and classification with KNN is better, please.

  • @kindaeasy9797
    @kindaeasy9797 5 months ago

    But in PCA's geometric intuition, if I rotate the axes clockwise, then the variance of rooms will decrease, right? And if I do the same process taking washrooms on the x-axis and rooms on the y-axis, then washrooms would get selected, right??
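    The doubt above can be checked numerically: the variance captured along a direction depends only on the direction relative to the data cloud, not on which feature is drawn on which axis. A small sketch with synthetic "rooms"/"washrooms" data (hypothetical spreads, NumPy only):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    rooms = rng.normal(scale=3.0, size=500)      # high-spread feature
    washrooms = rng.normal(scale=1.0, size=500)  # low-spread feature
    X = np.column_stack([rooms, washrooms])

    def variance_along(X, theta):
        """Variance of the data projected onto the unit vector at angle theta."""
        u = np.array([np.cos(theta), np.sin(theta)])
        return np.var(X @ u)

    v_rooms = variance_along(X, 0.0)        # along the rooms axis
    v_wash = variance_along(X, np.pi / 2)   # along the washrooms axis
    print(v_rooms > v_wash)                 # True: rooms has more spread

    # Swapping which feature sits on which axis only swaps the angles;
    # the direction of maximum variance (what PCA keeps) is unchanged.
    print(np.isclose(variance_along(X[:, ::-1], np.pi / 2), v_rooms))  # True
    ```

    So rotating the axes (or relabeling them) changes which angle corresponds to "rooms", but PCA always searches over all directions and keeps the one with maximum variance.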

  • @beb57swatimohapatra21
    @beb57swatimohapatra21 9 months ago

    Best course for ML

  • @arpitchampuriya9535
    @arpitchampuriya9535 1 year ago

    Excellent

  • @arshad1781
    @arshad1781 3 years ago

    thanks

  • @amanrajdas4540
    @amanrajdas4540 1 year ago +1

    Sir, your videos are really amazing; I have learned a lot from them. But I have a doubt about feature construction and feature extraction: they both look similar. So can you please tell me the one major difference between these two?

  • @We.shall.fly1
    @We.shall.fly1 5 months ago

    I have a doubt: if one variable is in the range 0 to 1 and another is in the range 0 to 1000 (and so has more variance/spread), why does choosing the 2nd variable just by looking at variance make sense? It may be a matter of units, like km vs. cm. For this problem we use scaling. Am I right?
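    The commenter's reasoning is sound: raw variance is unit-dependent, so features are usually standardized before PCA. A sketch of the effect (synthetic data; assumes scikit-learn):

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    # Two hypothetical independent features on very different scales
    # (think cm vs. km): ranges 0-1 and 0-1000.
    X = np.column_stack([rng.uniform(0, 1, 300), rng.uniform(0, 1000, 300)])

    # Without scaling, the 0-1000 feature dominates purely because of its units
    raw = PCA(n_components=2).fit(X)
    print(raw.explained_variance_ratio_)     # first PC grabs nearly everything

    # Standardizing first puts both features on unit variance
    X_std = StandardScaler().fit_transform(X)
    scaled = PCA(n_components=2).fit(X_std)
    print(scaled.explained_variance_ratio_)  # variance now split roughly evenly
    ```

    After standardization, no feature wins merely because of its measurement unit, which is why scaling is the usual first step of a PCA pipeline.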

  • @pkr1kajdjasdljskjdjsadjlaskdja
    @pkr1kajdjasdljskjdjsadjlaskdja 5 months ago

    Brother, why isn't this video going viral? Thank you, sir ❀

  • @AyushPatel
    @AyushPatel 3 years ago +1

    Sir, I just wanted to ask: can we write our own machine learning algorithms from scratch instead of using sklearn and TensorFlow? Please make a video about that. I have been following your whole series. Sir, do reply. Thanks for your efforts.

    • @ytg6663
      @ytg6663 3 years ago

      Yes, you can write them, friend...
      Yes, you can...

    • @vikramraipure6366
      @vikramraipure6366 1 year ago

      I am interested in group study with you; reply to me, bro.

    • @AyushPatel
      @AyushPatel 1 year ago

      @@vikramraipure6366 Actually, I am currently working on another project, so... I am sorry.
      Thanks for the proposal!

    • @yashwanthyash1382
      @yashwanthyash1382 1 year ago

      My suggestion is to use the sklearn library for existing algorithms. If that doesn't work, create your own algorithm.

  • @sachinahankari
    @sachinahankari 3 months ago

    The variance of the grocery shop is greater than that of the number of rooms, but you have shown the reverse.

  • @surajghogare8931
    @surajghogare8931 2 years ago

    Clever explanation

  • @x2diaries506
    @x2diaries506 1 year ago

    Dear sir, I am confused about the variance formula and your interpretation. Kindly recheck.

  • @shivendrarajput4413
    @shivendrarajput4413 22 days ago

    that is what we do

  • @ayushtulsyan4695
    @ayushtulsyan4695 1 year ago

    Brother, please make a playlist on statistical applications in Data Science.

  • @namansethi1767
    @namansethi1767 2 years ago

    Thank you, sir

    • @vikramraipure6366
      @vikramraipure6366 1 year ago

      I am interested in group study with you; reply to me, bro.

    • @namansethi1767
      @namansethi1767 1 year ago

      Done.... Give me your mobile number.
      I will call you when I'm free.

  • @0Fallen0
    @0Fallen0 1 year ago

    24:24 Aha! So PCA finds an alternate coordinate system and uses the change-of-basis matrix to transform the data.
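    That reading matches the standard construction: center the data, eigendecompose its covariance matrix, and use the eigenvectors as the change-of-basis matrix. A minimal NumPy sketch (synthetic data, not from the video):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(size=400)
    X = np.column_stack([x, 0.8 * x + rng.normal(scale=0.3, size=400)])

    # Center the data, then eigendecompose its covariance matrix
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues

    # Sort components by decreasing variance; their columns are the new axes
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order]

    # Projecting onto the eigenvectors is exactly a change of basis
    X_new = Xc @ components

    # In the PCA basis the covariance matrix is (numerically) diagonal
    print(np.round(np.cov(X_new, rowvar=False), 6))
    ```

    The diagonal covariance in the new basis is the point of the transformation: the new axes are uncorrelated, and the first one carries the maximum variance.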

  • @learnfromIITguy
    @learnfromIITguy 1 year ago

    solid

  • @devnayyar9536
    @devnayyar9536 1 year ago

    Sir, will we get your notes?

  • @Star-xk5jp
    @Star-xk5jp 5 months ago

    Day3:
    Date:11/1/24

  • @vatsalshingala3225
    @vatsalshingala3225 1 year ago

    ❀❀❀❀❀❀❀❀❀❀❀❀❀❀

  • @murumathi4307
    @murumathi4307 2 years ago

    Does this work the same as SVM? đŸ€”
