Part 1 - Decision Tree Classifier In-Depth Intuition In Hindi | Krish Naik
- Added: 5 June 2024
- The decision tree classifier creates a classification model by building a decision tree. Each node in the tree specifies a test on an attribute, and each branch descending from that node corresponds to one of the possible values of that attribute.
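As a quick, hedged illustration of this idea (not from the video; a minimal sketch using scikit-learn on its built-in iris dataset):

```python
# Minimal sketch (my own setup, not from the video): fit a decision tree
# classifier with scikit-learn and print the learned attribute tests.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# criterion="entropy" splits by information gain; the default "gini" uses Gini impurity
clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=42)
clf.fit(X_train, y_train)

print(clf.score(X_test, y_test))  # accuracy on held-out data
print(export_text(clf))           # each node is a test on one attribute
```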
GitHub URL: github.com/krishnaik06/Machin...
Timestamps
0:00:00 Introduction
0:01:01 Agenda
0:02:25 Basic Decision Tree
0:06:25 Examples Dataset
0:07:24 Decision Tree Construction
0:12:16 Pure And Impure Split
0:13:23 Entropy Intuition
0:21:27 Gini Impurity
0:25:14 Information Gain Intuition
Subscribe to the @krishnaik06 channel for Data Science videos in English.
channel link: bit.ly/3aeve4r
ML playlist in hindi: bit.ly/3NaEjJX
Stats Playlist In Hindi: bit.ly/3tw6k7d
Python Playlist In Hindi: bit.ly/3azScTI
Now I will be uploading Data Science videos in Hindi.
#KrishNaik #decisiontreeclassifier #entropy #informationgain
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06
We are happy to announce that iNeuron is coming up with a 6-month Live Full Stack Data Analytics batch with job assistance and an internship, starting from 18th June 2022. The instructors of the course will be me and Sudhanshu. The course price is really affordable: Rs 4000 INR, including GST.
The course content will be available for a lifetime, along with pre-recorded videos.
You can check the course syllabus below
Course link: courses.ineuron.ai/Full-Stack-Data-Analytics
From my side, you can avail an additional 10% off by using the Krish10 coupon code.
Don't miss this opportunity and grab it before it's too late. Happy Learning!!
Wonderful explanation, sir.... I'm already enrolled in Data Science with one of the edtechs of India... no doubt the teachers there also teach well, but the content in English doesn't settle properly in my mind in one go... this Hindi content went into my mind in such a way that now it will stay forever... Thank you for your efforts..
You explain very well, Sir 👌🏻💯
We are grateful to you.
Thanks sir, in Hindi explanations you tend to cover topics better (your English videos are also of far better quality than anyone else's).
Even after an era, no one will beat you, sir!! Incredible explanation, thank you so much, sir.
Many, Many Thanks .....so lovely of you
Worth watching😍😍😍
It is one of the best and simplest explanations so far.
Wonderful explanation, sir; no one can explain like you...🙏🙏🙏
Thank you.. sir 😇
That was awesome
Thank you Krish for Crystal Clear Explanation.❤
You make everything look so easy
what an amazing tutorial...hats off sirji!!!...
Great explanation... hard to find anywhere else 👌👌
hello krish sir... your explanation is easy to understand and anyone can learn easily..thank you sir...😊
Amazing ❤
wonderfully explained sir!
Awesome Explanation....Thanks A Lot....Keep It Up !!!
Thanks sir please continue this series
sir, I really find your videos very helpful. thanks a lot.
Your teaching skills are awesome.
Thank you Krish sir ........
Great explanation, sir.
Wonderful explanation given by you, sir, in Hindi.
Hello Krish sir.. thank you so much 🙏 for an excellent explanation.
Great explanation
Very well explained by you. It helped me a lot. Thank you very much.
Very well explained sirjiiii
Thank you sir.. it's easy to understand...
Wonderful
Thanks it is really helpful and easy to understand
You're a legend, dear sir. Thank you, be happy 😍
nice tutorial
in one word bosssss
Nice explanation
In the entropy formula: H(S) = -Σ p(x) * log2(p(x))
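For anyone who wants to verify the entropy numbers from the video, a small sketch of this formula in Python (the function name is my own):

```python
import numpy as np

def entropy(class_counts):
    """H(S) = -sum over classes of p(x) * log2(p(x))."""
    p = np.asarray(class_counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # 0 * log2(0) is taken as 0 by convention
    return -np.sum(p * np.log2(p))

print(entropy([3, 3]))  # completely impure 50/50 split -> 1.0 bit
print(entropy([6, 0]))  # pure split -> 0.0
```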
wow
As always Very well explained.
I have one query, sir. You said that if the dataset is very big then use the Gini index, otherwise entropy is fine. But finding the entropy is a must for information gain, as there is no mention of the Gini index in the information gain formula. So is it possible to use the Gini index to find information gain?
Kindly throw light on that. 😊
There is a way to calculate the information gain using the Gini index as well; see the sketch below.
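To make that reply concrete, a hedged sketch (function names are my own): information gain is the parent's impurity minus the size-weighted impurity of the children, and any impurity measure, entropy or Gini, can be plugged in:

```python
import numpy as np

def gini(class_counts):
    """Gini impurity: 1 - sum over classes of p(x)^2."""
    p = np.asarray(class_counts, dtype=float) / np.sum(class_counts)
    return 1.0 - np.sum(p ** 2)

def information_gain(parent_counts, child_counts_list, impurity=gini):
    """Parent impurity minus the size-weighted impurity of the child nodes."""
    n = float(np.sum(parent_counts))
    weighted = sum((np.sum(c) / n) * impurity(c) for c in child_counts_list)
    return impurity(parent_counts) - weighted

# Gini-based gain for splitting a 9-yes/5-no node into two children
print(information_gain([9, 5], [[6, 1], [3, 4]]))  # ~0.09
```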
Sir, you said H(S) is the entropy of the root node, but I think it is the entropy of the target attribute.
Sir, can we find information gain using Gini impurity?
Sir, please make a video on sklearn and seaborn. Thank you.
In calculating information gain, can we use Gini impurity instead of entropy?
Very nice explanation, sir. I have one question: how to get an internship, as no one is hiring freshers?
Sir, keep making videos like this. You probably don't even realise how big a help this is for DATA SCIENCE lovers.
Heartfelt thanks 🙏🙏🙏
❤❤❤❤❤❤❤❤❤❤
Sirrrr.... ❤ I have a question 🙋!
If an interviewer asks why we are using the minus ( - ) sign in entropy, what should we say? Please reply........ ❤
It's just the formula.
Don't worry, they don't ask about these types of mathematical formulas.
They can ask what Gini impurity is.
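On the minus-sign question above: since probabilities satisfy 0 < p(x) <= 1, log2(p(x)) is always <= 0, so the summation Σ p(x) * log2(p(x)) is itself negative or zero; the leading minus sign just flips it so that entropy comes out non-negative. For a 50/50 split, for example: -(0.5*log2(0.5) + 0.5*log2(0.5)) = -(0.5*(-1) + 0.5*(-1)) = 1.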
Sir, according to external sites, Gini impurity ranges from 0 to 1.
Please confirm this…
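For what it's worth on the range question: for a binary split the maximum Gini impurity is 0.5, reached at a 50/50 split (1 - (0.5^2 + 0.5^2) = 0.5); in general, with k equally likely classes the maximum is 1 - 1/k (e.g., 1 - 4*(0.25^2) = 0.75 for four classes). So Gini impurity only approaches 1 when there are many classes.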
0*log2(0) is undefined, so how is it coming out as 0??
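By convention in information theory, the term p(x) * log2(p(x)) is taken to be 0 when p(x) = 0, because p * log2(p) tends to 0 as p approaches 0 from the right; a class that never occurs contributes nothing to the entropy.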
My answer for the entropy is coming out as 0.6, not 1.
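A likely cause, assuming a 50/50 split was being computed: using the natural log instead of log base 2 gives -(0.5*ln(0.5) + 0.5*ln(0.5)) ≈ 0.69, not 1. The entropy formula in the video uses log2, under which the same split gives exactly 1 bit.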
Hello sir,
You are doing a great job.
The video volume is very low; it is difficult to listen to.