Live Day 4 - Discussing Decision Tree and Ensemble Machine Learning Algorithms

  • Added 3 Feb 2022
  • Join the community session at ineuron.ai/course/Mega-Community . All the materials will be uploaded there.
    Playlist: • Live Day 1- Introducti...
    The Oneneuron lifetime subscription has been extended.
    On the Oneneuron platform you will be able to get 100+ courses (at least 20 courses will be added monthly, based on your demand).
    Features of the course:
    1. You can raise a demand for any course (fulfilled within 45-60 days).
    2. You can access the innovation lab from ineuron.
    3. You can use our incubation based on your ideas.
    4. Live sessions coming soon (mostly by Feb).
    Use coupon code KRISH10 for an additional 10% discount.
    And many more.....
    Enroll now
    OneNeuron link: one-neuron.ineuron.ai/
    Call our team directly in case of any queries:
    8788503778
    6260726925
    9538303385
    866003424

Comments • 100

  • @MrSurya-hn7pu
    @MrSurya-hn7pu 4 months ago +6

    1:14:47 Krish, you're right that Gini can't be greater than 0.5, but only in the binary classification case. Here we can clearly see that the node has 3 output classes, each with frequency 50, so if we calculate the Gini index we get 1 - (1/9 + 1/9 + 1/9), which is clearly 0.667. So it is right over there.
    So the generalized maximum Gini impurity for a classification problem is 1 - 1/n, where n is the number of classes in the output feature. Hope this helps.
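
    A minimal sketch of that calculation in Python (the 50/50/50 class counts are the ones quoted above; gini_impurity is an illustrative helper, not code from the video):

        def gini_impurity(counts):
            """Gini impurity of a node, given the class counts inside it."""
            total = sum(counts)
            return 1 - sum((c / total) ** 2 for c in counts)

        print(gini_impurity([50, 50, 50]))  # 0.666..., the node discussed at 1:14:47
        # With n equally frequent classes the impurity is 1 - n*(1/n)^2 = 1 - 1/n,
        # which is its maximum, so 0.5 is only the ceiling for binary problems.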

  • @svitlanatuchyna7154
    @svitlanatuchyna7154 26 days ago +1

    Great tutorial and very good explanation! Highly recommend to anyone who is trying to understand Decision Tree Algorithm. Helped me a lot! Thank you very much Krish!

  • @balasubramanian7756
    @balasubramanian7756 2 years ago +1

    Krish... you are doing a great job for all of us. Thanks a lot... and keep going.

  • @edwardmansal8459
    @edwardmansal8459 2 years ago +2

    Great lecture. Simplified to the understanding of a beginner. Remain blessed.

  • @Lucia-el6ex
    @Lucia-el6ex a year ago +3

    Thank you for being an excellent teacher!

  • @arunbaist6386
    @arunbaist6386 2 years ago +25

    Great explanation, Krish. Coming from a non-technical background, I could easily understand the session. Keep posting such videos, thank you.

  • @ckpratibha9956
    @ckpratibha9956 a year ago +2

    Very informative sessions; I have never heard such an explanation on YouTube before. Thanks a lot.

  • @ashilshah3376
    @ashilshah3376 8 months ago +1

    Sir, your lectures are still useful even after a year, thank you for this :)

  • @pollypravir5378
    @pollypravir5378 2 years ago

    Thank you for the session.

  • @sidindian1982
    @sidindian1982 a year ago

    32:57, the entropy and GI part. Excellent explanation, sir, just loved it 🙂

  • @siddhantjain452
    @siddhantjain452 2 years ago +1

    Amazing lecture

  • @kyeiwilliam8463
    @kyeiwilliam8463 3 months ago

    Thanks for the insightful teaching, Sir Krish

  • @malvis8737
    @malvis8737 2 years ago

    Excellent professor!!

  • @paneercheeseparatha
    @paneercheeseparatha 11 months ago +3

    Information gain explained so simply. I love this guy. You should put a "buy me a coffee"-style link in the description so that people who are willing to donate can do so easily.

  • @sahithkannayyagari7652

    Hi Krish, you are awesome!🌟

  • @azharafridi9619
    @azharafridi9619 a month ago +1

    Very interesting, sir; I enjoyed the whole video a lot... thanks for the effort you have put in.... God bless you.

  • @shahidulislamzahid
    @shahidulislamzahid 7 months ago

    This was one of the best Live sessions.

  • @anuragthakur5787
    @anuragthakur5787 2 years ago +19

    Great session, sir.
    Please don't worry about the live viewer count;
    some of us catch up later.

  • @priyankachore1617
    @priyankachore1617 2 years ago +1

    Thank you so much, sir!

  • @sindhuorigins
    @sindhuorigins a year ago +1

    Some more clarity on how the splits are made for decision tree regression would help; perhaps you could make a separate video on that.

  • @raghavsharma6430
    @raghavsharma6430 2 years ago +2

    As usual you rocked the session Krish!!!!

  • @amitjena1556
    @amitjena1556 9 months ago

    Thank you.

  • @rohitjagtap5228
    @rohitjagtap5228 2 years ago

    well explained

  • @arunnandam2272
    @arunnandam2272 a year ago

    Great Session

  • @BEC-SowmyaS
    @BEC-SowmyaS 5 months ago

    Wonderful session, no one can beat it.

  • @MathCorpsVN
    @MathCorpsVN a year ago

    Very nice! Great teaching, useful lectures. Thank you so much.

  • @mdjibran5983
    @mdjibran5983 2 years ago +17

    Krish, please, from tomorrow if possible, disable the chat during the live session. It seems to affect your concentration, and because of that the people who really come here to learn and understand the complex equations in an easy way suffer a lot. So please disable the chat, or at least don't keep it open on another screen where you watch it continuously.

  • @muhammadarham8240
    @muhammadarham8240 2 years ago +5

    Krish, I would say only one thing: YOU ARE AN AMAZING TEACHER!!!

  • @adarshraj-ox1xr
    @adarshraj-ox1xr 7 months ago

    nice lecture

  • @saurabharbal2684
    @saurabharbal2684 a year ago

    Hello, Krish sir,
    your teaching is awesome.
    Can you please type here what you said at the start when your mic was off?
    It will help me a lot.

  • @sandipansarkar9211
    @sandipansarkar9211 2 years ago +1

    finished watching

  • @chaitanyabhardwaj9909
    @chaitanyabhardwaj9909 2 years ago

    you're the man

  • @ankan54
    @ankan54 2 years ago +2

    Hi Krish, in the practical example, is the reason for the Gini being 0.667 the fact that there are 3 classes and all of them have an equal number of samples?

  • @AlphaMainly
    @AlphaMainly 10 months ago

    1:15:20, the reason for Gini greater than 0.5: here we have more than two classes to predict, so it's a multiclass classification, and the Gini impurity formula would be 1 - p1^2 - p2^2 - p3^2 - p4^2, so it's possible to have a Gini value greater than 0.5.

  • @rafibasha1840
    @rafibasha1840 2 years ago +1

    Please explain when to use Gini vs. entropy.

  • @m-bk4um
    @m-bk4um 7 months ago +1

    great

  • @AmitKumar-ym8uz
    @AmitKumar-ym8uz a year ago

    Excellent explanation, but I still have a doubt about the formulas. How do you derive the Gini impurity and entropy formulas?
    This question was asked in one of my interviews. I searched a lot but did not find the answer.

  • @mayur36
    @mayur36 2 years ago +1

    I came here to see how we select the root node in a decision tree regressor, but it seems you have just used one feature. So my question is: which feature is selected first if we have multiple columns, and depending on what?

  • @jahidmdhasan1021
    @jahidmdhasan1021 11 months ago

    Hi sir, I have a doubt. Using Gini impurity over entropy helps us with faster execution, but can we use Gini impurity in information gain to select a feature to split on? If yes, what will the formula be? Will it be 2*GI in place of H(S) and H(Sv)?
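
    One common way to do this (and roughly what CART-style trees such as scikit-learn's "gini" criterion do) is to keep the same weighted-average structure as information gain and simply substitute Gini impurity for entropy; no 2*GI factor is needed. A rough illustrative sketch, with made-up class counts:

        def gini(counts):
            total = sum(counts)
            return 1 - sum((c / total) ** 2 for c in counts)

        def gini_gain(parent, children):
            """Impurity decrease: Gini(parent) minus the weighted average Gini of the children."""
            n = sum(parent)
            return gini(parent) - sum(sum(ch) / n * gini(ch) for ch in children)

        # hypothetical parent node with 9 yes / 5 no, split into (6, 2) and (3, 3)
        print(gini_gain([9, 5], [[6, 2], [3, 3]]))  # ~0.03; the feature with the largest decrease wins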

  • @vijayabaskaranvenkatraman2389

    Sir, I like your teaching method; you do it well.

  • @rafibasha1840
    @rafibasha1840 2 years ago +3

    Please make videos on polynomial regression and transformations, Krish... Also a one-week session on in-depth ANN, CNN, RNN, GAN, NLP, time series, and MLOps.

    • @ankitbiswas8380
      @ankitbiswas8380 2 years ago

      He already has a playlist for deep learning. Check his channel's playlist section.

    • @rafibasha1840
      @rafibasha1840 2 years ago

      @@ankitbiswas8380 I am asking for a live session.

    • @sh__--
      @sh__-- a year ago

      The concepts are the same in online (live) and offline (playlist) teaching...🙄

  • @hari8568
    @hari8568 2 years ago +1

    At 29:15, log base 2 of zero is undefined, right?
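
    For reference (not from the video): the log2(0) term only ever appears multiplied by its own probability, and the usual convention is to treat 0 * log2(0) as 0 (its limiting value), so a pure node gets entropy 0 rather than an undefined value. A tiny illustrative sketch:

        import math

        def entropy(probs):
            """Shannon entropy using the 0 * log2(0) = 0 convention."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(entropy([1.0, 0.0]))   # 0.0, a pure node
        print(entropy([0.5, 0.5]))   # 1.0, the most impure binary node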

  • @Nancivlog
    @Nancivlog 2 years ago

    Can we apply fuzzy logic in a decision tree? If so, how?

  • @kunallokhande4397
    @kunallokhande4397 2 years ago +2

    It's not theoretical,
    it's mathematical,
    and I love mathematics.

  • @sayanchakraborty2483
    @sayanchakraborty2483 2 years ago

    The playlist link is for advanced statistics, not for ML live sessions

  • @bansaljayti9706
    @bansaljayti9706 a year ago

    How do we split on categorical features in a DT regressor?

  • @rajashekharpotoori1668
    @rajashekharpotoori1668 2 years ago +10

    Please make a 7-day class on EDA and feature scaling in the future.

  • @bankoleayoade8852
    @bankoleayoade8852 a year ago

    Please, how do I get all the materials used in these tutorials?

  • @saravanankarunamurthy9854

    If I'm getting entropy values like 0.4 or 0.56, is that a pure split or an impure split? Most of the time we will not get exactly 0 or 1.

  • @nancysrivastava2558
    @nancysrivastava2558 2 years ago

    yes

  • @parthbagda1303
    @parthbagda1303 6 months ago

    Why do we select the feature which has the highest information gain?

  • @manasagiduthuri6557
    @manasagiduthuri6557 a year ago

    Could anyone help me out? How can I find these materials and the code?

  • @abhi9029
    @abhi9029 2 years ago

    I have a question: why was entropy introduced in the first place? I mean, if we can use Gini for a large number of features, then we can also use Gini for a small number of features, right?

  • @Nancivlog
    @Nancivlog 2 years ago

    How is a decision tree different from fuzzy logic?

  • @priyanshityagi3126
    @priyanshityagi3126 a year ago

    Are these sessions enough to understand ML??

  • @akarkabkarim
    @akarkabkarim a year ago

    rich content

  • @okikannamdi9218
    @okikannamdi9218 a year ago

    In the practical implementation you didn't show how to plug in entropy or Gini impurity.

    • @okikannamdi9218
      @okikannamdi9218 a year ago +1

      Like, is there no mathematical Python script explaining how entropy or Gini impurity works in this algorithm?

  • @skvali3810
    @skvali3810 a year ago +2

    Can anyone answer my question? In decision tree regression we calculate the MSE from (ŷ - y). What are the y value and the ŷ value here if we consider F1 as the root node?

    • @abhimanyukspillai6572
      @abhimanyukspillai6572 a year ago

      I believe the ŷ value is the mean value and the y value is the original value in the dataset. Please correct me if I am wrong. (See the sketch after this thread.)

    • @sagark1431
      @sagark1431 a year ago

      'ŷ' is the predicted value and 'y' is the actual value.

    • @skvali3810
      @skvali3810 a year ago +1

      @@sagark1431 To construct a decision tree we don't have any predicted values, right? We build the tree from our dataset, and in the dataset we only have y values, not ŷ values. Only after constructing the tree can we get the model's predicted values and calculate the MSE, no?

    • @sagark1431
      @sagark1431 a year ago

      @@skvali3810 czcams.com/users/clipUgkxYmYUbLQ2ZjBP4sPhk_azjY63vhBe_RcE See if this helps.

    • @skvali3810
      @skvali3810 a year ago

      @@abhimanyukspillai6572 The mean of what?
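
    An illustrative sketch for the thread above (the feature values, threshold, and targets below are made up): in a regression tree, the prediction ŷ inside any candidate node is simply the mean of the training targets that land in that node, so the MSE of a split can be computed from the training data alone, before any fitted model exists.

        import numpy as np

        def node_mse(y):
            """MSE of a node that predicts the mean of its own targets."""
            return float(np.mean((y - y.mean()) ** 2))

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical feature F1
        y = np.array([1.1, 0.9, 1.0, 4.2, 3.8, 4.0])   # hypothetical targets

        threshold = 3.5                                 # candidate split on F1
        left, right = y[x <= threshold], y[x > threshold]
        weighted = (len(left) * node_mse(left) + len(right) * node_mse(right)) / len(y)
        print(node_mse(y), weighted)  # keep the split if it lowers the weighted MSE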

  • @maths_impact
    @maths_impact a year ago

    👍👍👍

  • @oo_anonymous8672
    @oo_anonymous8672 2 years ago

    Sir, Gini can be more than 0.5. We'll look at a dataset with many different labels:
    from collections import Counter
    def gini(rows):
        # illustrative definition of gini(), assuming each row holds a single label
        counts = Counter(row[0] for row in rows)
        return 1 - sum((c / len(rows)) ** 2 for c in counts.values())
    lots_of_mixing = [['Apple'],
                      ['Orange'],
                      ['Grape'],
                      ['Grapefruit'],
                      ['Blueberry']]
    # This will return 0.8
    gini(lots_of_mixing)  # output = 0.8

    • @ankan54
      @ankan54 2 years ago

      I think it is coming out to 0.8 because all the classes here have an equal number of samples; correct me if I'm wrong, please.

    • @himanshusingh-bh5kh
      @himanshusingh-bh5kh a year ago

      Yep, you are right. I tried it on 7 categories and the Gini value turned out to be 0.84. So the more output labels there are, the higher the Gini value, even surpassing the 0.5 range, which defies the binary formula's logic. I think that's the limitation: the 0-0.5 range works for binary, and maybe 3 classes, but not at a root node with many classes.

    • @shivasurya1928
      @shivasurya1928 a year ago

      Gini max = 1 - (1/n), where 'n' is the number of unique classes in the target.

  • @rosemarydara1025
    @rosemarydara1025 a year ago

    no audio for this video????

  • @Good-tz8ep
    @Good-tz8ep a year ago

    💕❤️❤️💕

  • @hetvigandhi6705
    @hetvigandhi6705 2 years ago

    Where is Day 5?

  • @ankitbiswas8380
    @ankitbiswas8380 2 years ago +1

    The regressor part should have been explained more clearly; that part was poorly explained.

  • @siddheshdhanawade3709
    @siddheshdhanawade3709 10 months ago

    Notes?

  • @visitorsvlog
    @visitorsvlog 2 years ago

    Hi

  • @aradhyakanth8409
    @aradhyakanth8409 2 years ago

    First break: 47:26

  • @aradhyakanth8409
    @aradhyakanth8409 2 years ago

    Second break: 58:41

  • @rayanwaheibkhalifa1299

    I can't hear you

  • @sachvaidya
    @sachvaidya a year ago +1

    You need to add a few more practice examples; the decision tree section seems hurried towards the end. ** I am not a spammer

  • @kulbhushansingh1101
    @kulbhushansingh1101 a year ago

    With audio: 2:54

  • @subhajitdutta1443
    @subhajitdutta1443 2 years ago

    To whom it may concern,
    if the probability P = 0, then the Gini value becomes 1
    as per the formula. Then why does it always range from 0 to 0.5?
    Thank you,
    Subhajit

    • @nikhilshukla3188
      @nikhilshukla3188 2 years ago +1

      The Gini formula is 1 - sum over all n classes of P(class)^2, which includes both P(Yes) and P(No), for example, so Gini never becomes 1 even when one P = 0: the remaining class probabilities then sum to 1.
      For further information:
      www.bogotobogo.com/python/scikit-learn/scikt_machine_learning_Decision_Tree_Learning_Informatioin_Gain_IG_Impurity_Entropy_Gini_Classification_Error.php
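
      A quick worked check of that range for the binary case (illustrative, not from the video): with two classes, Gini(p) = 1 - p^2 - (1 - p)^2, which is 0 at p = 0 or p = 1 and peaks at 0.5 when p = 0.5; values above 0.5 only appear once there are more than two classes.

          def binary_gini(p):
              return 1 - p**2 - (1 - p)**2

          print(binary_gini(0.0), binary_gini(0.5), binary_gini(1.0))  # 0.0 0.5 0.0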

  • @satyajitpaul339
    @satyajitpaul339 2 years ago

    Where can we find the recordings of the last 3 days' sessions?

  • @blueplanet3967
    @blueplanet3967 2 years ago +1

    Is the Gini max value of 0.5 applicable only to this example, or to all problems? czcams.com/video/dGNJ-feQLC4/video.html

  • @ganps87269
    @ganps87269 2 years ago

    Ignore spammers like the "missing value" thing. Spammers are less than 0.5% of viewers; drop them and ignore them in your live session chats 🙏 If you don't react, they will not post anything later! Thanks, bro, for your live sessions. We want you to continue them.

  • @mylife7810
    @mylife7810 2 years ago

    Sir, please learn video editing skills

  • @abnyk92
    @abnyk92 7 months ago

    😂

  • @chitrranshia7765
    @chitrranshia7765 2 years ago +2

    This was one of the best Live sessions.