Curse of Dimensionality Easily Explained | Machine Learning

  • Published 9 Jun 2019
  • The curse of dimensionality refers to various phenomena that arise when analyzing and organizing data in high-dimensional spaces that do not occur in low-dimensional settings such as the three-dimensional physical space of everyday experience.
    #CurseofDimensionality
    Please subscribe to the channel for more interesting content.
    You can buy my book on Finance with ML and DL from amazon.in
    url: www.amazon.in/Hands-Python-Fi...

Comments • 35

  • @manujkumarjoshi9342 · 3 years ago +28

    I have a perfect example: my instructor gave me a 2.5-hour lecture on this and I was still confused, but your 7-minute video made it clear. That's a curse of dimensionality. Great!

  • @chaitanyamallepudi3531 · 3 years ago +3

    Could we select features using the L2 regularization coefficients, which would help us pick the right features, the ones that are not shrunk? Could that reduce the curse of dimensionality?
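A quick sketch of the idea in NumPy (my own illustration, not from the video). One caveat: L2 regularization shrinks coefficients toward zero but never exactly to zero, so selecting features from L2 coefficients needs an explicit threshold; L1 (lasso) is what actually zeroes coefficients out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 10 features, but only the first 3 actually drive y.
n, d = 200, 10
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:3] = [2.0, -3.0, 1.5]
y = X @ true_w + 0.1 * rng.normal(size=n)

# Closed-form ridge (L2) regression: w = (X^T X + alpha*I)^(-1) X^T y.
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

# L2 only shrinks, so feature selection requires a threshold on |w|.
threshold = 0.5
selected = np.flatnonzero(np.abs(w) > threshold)
print(selected)  # indices of the informative features
```

The threshold here is a judgment call (which is exactly what the next question below asks about); in practice it is usually tuned by cross-validation or replaced by L1 regularization, which makes the choice for you.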

  • @akashgayakwad9550 · 4 years ago +5

    How do we know the threshold value for feature selection?

  • @ga43ga54 · 5 years ago +5

    Please make a video on the math behind t-SNE.... Great video!! Thank you

  • @sandipansarkar9211 · 3 years ago +1

    Watched it for the second time for better understanding and coding practice. Thanks

  • @HARSHRAJ-2023 · 5 years ago

    Eagerly waiting for your next video. Please upload soon.

  • @cmbharathi2064 · 4 years ago

    Simply explained. Thank you, Krish

  • @MarcelloNesca · 4 years ago

    This was a great video, explained very easily!

  • @vijaynale7893 · 5 years ago

    Thank you so much, bro. Waiting for your next video

  • @dholearihant6011 · 4 years ago

    Sir, thanks for the video. I am new to machine learning and this helped me

  • @eneskosar.r · 4 years ago

    Very well explained. Easy to understand.

  • @Albertrose.24 · 2 years ago

    Thanks for explaining clearly, sir.

  • @sandipansarkar9211 · 3 years ago +1

    Great explanation, Krish. No need to make notes, just understand. Thanks

  • @discoverdevops5368 · 4 years ago +2

    In very simple terms: when you are working with a large number of dimensions, discovering patterns becomes challenging. That is the curse of dimensionality.

  • @senadredzic8835 · 3 years ago

    Well explained! Thanks

  • @subho2859 · 3 years ago +1

    Does the curse of dimensionality only occur when the number of features increases exponentially?

  • @salihsartepe2614 · 8 months ago

    Thank you for this simple explanation :)

  • @louerleseigneur4532 · 3 years ago

    Thanks Krish

  • @jamalnuman · 4 months ago

    Great

  • @manjunath.c2944 · 4 years ago

    Kindly do a video on chunking and the lazy-learner method

  • @somubd · 4 years ago

    Thank You

  • @anithaani4672 · 2 years ago

    Thank you bro

  • @jayasimhayenumaladoddi1602

    Can you please make a video on OLPP?

  • @redreaper8652 · 7 months ago

    Basically diminishing returns?

  • @hsin-yuku4086 · 4 years ago

    What does accuracy mean here? Does it mean the model's ability to predict?

  • @skyman7290 · 5 years ago +7

    Thanks for the effort. But what you explained is not the curse of dimensionality; it is simply increasing the number of model parameters, which leads to overfitting. The curse of dimensionality is about high-dimensional data, where the distribution of distances becomes nearly independent of the data.

    • @krishnaik06 · 5 years ago +1

      Hello my dear friend. I think you are confusing overfitting, underfitting, and the curse of dimensionality. Overfitting and underfitting usually happen when you don't select the right hyperparameters for the machine learning algorithm you are using. Here we are discussing attributes, i.e. features. Today I will also be uploading a video on overfitting and underfitting.
      Thanks,
      Krish

    • @skyman7290 · 5 years ago +2

      @@krishnaik06 No, I am not confused. When you have more features, your model needs more parameters, so you increase the model's complexity, and that leads to overfitting when you have little training data. The solution is either regularizing the parameters or reducing the dimensionality (of course, more training data helps too). The curse of dimensionality is a different subject; please at least see the Wikipedia page: en.wikipedia.org/wiki/Curse_of_dimensionality
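The distance-concentration effect this comment describes can be checked in a few lines. A NumPy sketch (my own illustration; exact numbers depend on the random seed):

```python
import numpy as np

rng = np.random.default_rng(42)

def relative_spread(dim, n_points=2000):
    """Spread of distances from a random query point to random data
    points in the unit cube, measured relative to the mean distance."""
    points = rng.uniform(size=(n_points, dim))
    query = rng.uniform(size=dim)
    dists = np.linalg.norm(points - query, axis=1)
    return (dists.max() - dists.min()) / dists.mean()

for dim in (2, 10, 100, 1000):
    print(dim, round(relative_spread(dim), 3))
```

As the dimension grows, the printed spread shrinks: the nearest and farthest neighbors become almost equally far away, which is why distance-based methods such as k-NN degrade in high dimensions.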

    • @adityachandra2462 · 4 years ago +2

      @@skyman7290 "The predictive power of a classifier or regressor first increases as the number of dimensions or features used is increased but then decreases,[4] which is known as Hughes phenomenon[5] or peaking phenomena" — that is from the link you gave. Kindly check the article before commenting on Krish's video. Peace!!

    • @subho2859 · 3 years ago

      @@krishnaik06 Is it only the choice of hyperparameters that is responsible for underfitting and overfitting, and not also the selection of the right features?

  • @masalaaa3 · 1 year ago +1

    One concern: the word "exponentially" is used incorrectly. The exponential increase required for reliable modeling is in the sample size, not in the number of features. Thank you for the good illustration, though. Time: 5:55
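The sample-size point can be made concrete with a toy calculation (my own, not from the video): to cover the unit cube [0,1]^d so that every grid cell at a fixed resolution contains at least one sample, the number of samples needed is exponential in the number of dimensions d.

```python
def samples_to_cover(dim, bins_per_axis=10):
    """Samples needed to put one point in every cell of a grid that
    splits each of `dim` axes into `bins_per_axis` intervals."""
    return bins_per_axis ** dim

for d in (1, 2, 3, 10):
    print(d, samples_to_cover(d))
# 1 -> 10, 2 -> 100, 3 -> 1000, 10 -> 10000000000
```

Each added feature multiplies the data requirement by the per-axis resolution, which is exactly why adding features without adding (exponentially more) data makes the feature space sparse.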
