193 - What is XGBoost and is it really better than Random Forest and Deep Learning?
- Date added: 7 Jul 2024
- Code generated in the video can be downloaded from here:
github.com/bnsreenu/python_fo...
Dataset used in the video: archive.ics.uci.edu/ml/datase...
XGBoost documentation:
xgboost.readthedocs.io/en/lat...
Video by original author: • Kaggle Winning Solutio... - Science & Technology
Very organized and clear with excellent examples that make it so easy to understand. Thank you!
Perfect tutorial, I am using XGBoost and random forest to analyze some work. Perfect tutorials for me. Always appreciate your continuous efforts to share your knowledge through youtube.
Very organized and straightforward! Waiting for other videos. Thank you for sharing this knowledge.
Very well explained, clear and concise. Thanks for taking your time
You're very welcome!
Magnificent!!! Thanks for the clear explanation, Sir.
Awesome Tutorial, Glad I got a great teacher..Thank you...
Thanks, Sreeni sir, for your valuable and knowledgeable content. Also waiting for the next semantic segmentation series; a discussion of hyperparameters and their tuning, and of time series analysis, would be highly helpful.
Your explanation is so interesting, haha. It helps me a lot to understand the material.
Very well explained, sir. It was intuitive for beginners. The analogies are interesting as well.
Glad to hear that
A very useful episode, thanks sir
Awesome comparison. Super thanks
I never comment on any YouTube videos, but I am compelled to do so here, because I learned most of my analyses for my dissertation following your tutorials. You're such a great tutor. Thank you so much.
Wow, thank you!
Wow, your explanation is awesome!!!
Don't stop, plz
Thank you so much for your explanation sir
Well explained. Thank you so much for the video.
Glad you liked it
Awesome video, thank you ! Greetings from Brazil!
Thanks for watching!
very useful video ! Thank you!
Very helpful thank you!
Hi. I have a 1500x11 dataset and I am trying to see which of cognitive ability, non-cognitive ability, family structure, parental involvement, and school characteristics predict academic performance (measured in terms of grades ranging from 1-5). Should I be using XGBoost for this problem or random forest? Thanks!
XGBoost with regularized rotations and synthetic feature construction can approximate the depth of a deep NN
Thanks for the video and effort. Can you make a time series video using xgboost, or something with multiple features (lags, rolling mean, etc.)?
thanks for the video
What do we have to change if we want to use XGBRegressor() instead of the classifier?
xgboost documentation is so confusing!
This was very clear and useful! Do you have any link to your code? Also, could xgboost be used for linear regression as well?
thank you very much
Thanks for the video. Something I noticed that you might have missed: in the figure, the most evolved species has lighter hair than the less evolved ones, which could give the false impression that species with lighter hair are more evolved. It would be great if you could adjust the figure.
very clear thanks :)
The example of bagging is so funny and I can fully relate
Can you make a quick video on normalization and standardization for an image dataset?
Sreeni said we need to normalize, but I always thought we didn't need to do that with trees... Am I confused on something?
Thanks for the video!!
The dataset at the UCI link is not available now. Could anyone provide an updated link?
if your explanations were a kaggle competition it would be top 1%
It's not clear where you got the dataset in CSV format - the .zip archive from the provided link includes only `wdbc.data` and `wdbc.names` files
Wow that analogy! 😂 Amazingly apt lol!
Where can I get the exact .csv file?
Loved the arranged marriage example! Made it very intuitive and easy to understand. Thank you!
What’s the method name? When are you presenting at NeurIPS? (I’ll be attending it :)
Great!
Perfect
Good one
Thank you! Cheers!
The dataset has been removed from the website. is it possible to upload it?
Just do a Google search for the keywords and you'll find it somewhere, maybe on Kaggle. I do not own the data so I cannot share it, legally.
Nice T-Shirt 😃
I have a question about the XGBoost algorithm: how does parallelization work in XGBoost? Could you explain with an example?
Another case of data leakage. You can't scale X and then split it into test and train. The scaling needs to happen after the split.
Thank you. At least someone understands.
So this video is incorrect?
You're worth more money
I am priceless :)
Looks like when you demo-ed random forest, you didn't comment out the xgb line, so you actually showed the fitting for xgb twice with the same results.
can you talk about transformers please?
Bro is trying to share about life and forgot what he's teaching 🤣🤣🤣🤣
The only place where I got a complete idea about XGBoost. Thank you!
Thank you!
Thank you very much.