193 - What is XGBoost and is it really better than Random Forest and Deep Learning?

  • Published 7 Jul 2024
  • Code generated in the video can be downloaded from here:
    github.com/bnsreenu/python_fo...
    Dataset used in the video: archive.ics.uci.edu/ml/datase...
    XGBoost documentation:
    xgboost.readthedocs.io/en/lat...
    Video by original author: • Kaggle Winning Solutio...
  • Science & Technology

Comments • 59

  • @pvrnaaz
    @pvrnaaz 2 years ago +7

    Very organized and clear with excellent examples that make it so easy to understand. Thank you!

  • @caiyu538
    @caiyu538 2 years ago +1

    Perfect tutorial, I am using XGBoost and random forest to analyze some work. Perfect tutorials for me. I always appreciate your continuous efforts to share your knowledge through YouTube.

  • @sudippandit6676
    @sudippandit6676 3 years ago +1

    Very organized and straightforward! Waiting for your other videos. Thank you for sharing this knowledge.

  • @ashift32
    @ashift32 2 years ago +2

    Very well explained, clear and concise. Thanks for taking your time

  • @omeremhan
    @omeremhan a year ago

    Magnificent!!! Thanks for the clear explanation, Sir.

  • @VarunKumar-pz5si
    @VarunKumar-pz5si 3 years ago +1

    Awesome tutorial, glad I got a great teacher. Thank you!

  • @venkatesanr9455
    @venkatesanr9455 3 years ago

    Thanks, Sreeni sir, for your valuable and knowledgeable content. Also waiting for the next semantic segmentation series, plus discussion of hyperparameters and their tuning, and time series analysis; that would be highly helpful.

  • @evazhong4419
    @evazhong4419 2 years ago

    Your explanation is so interesting haha, it helps me a lot to understand the material

  • @mhh5002
    @mhh5002 2 years ago +3

    Very well explained, sir. It was intuitive for beginners. The analogies are interesting as well.

  • @evyatarcoco
    @evyatarcoco 3 years ago

    A very useful episode, thanks sir

  • @drforest
    @drforest a year ago

    Awesome comparison. Super thanks

  • @riti_joshi
    @riti_joshi 2 months ago

    I never comment on any YouTube videos, but I am compelled to do so here, because I learned most of the analyses for my dissertation from your tutorials. You're such a great tutor. Thank you so much.

  • @semon00
    @semon00 3 months ago

    Wow, your explanation is awesome!!!
    Don't stop, please!

  • @sathishchetla3986
    @sathishchetla3986 10 months ago

    Thank you so much for your explanation sir

  • @vikramsandu6054
    @vikramsandu6054 3 years ago

    Well explained. Thank you so much for the video.

  • @mouraleog
    @mouraleog a year ago

    Awesome video, thank you ! Greetings from Brazil!

  • @SP-cg9fu
    @SP-cg9fu a year ago

    Very useful video! Thank you!

  • @RealThrillMedia
    @RealThrillMedia a year ago

    Very helpful, thank you!

  • @tannyakumar284
    @tannyakumar284 2 years ago

    Hi. I have a 1500x11 dataset and I am trying to see which of cognitive ability, non-cognitive ability, family structure, parental involvement, and school characteristics predict academic performance (measured in terms of grades ranging from 1-5). Should I be using XGBoost for this problem or random forest? Thanks!

  • @axe863
    @axe863 7 months ago +1

    XGBoost with regularized rotations and synthetic feature construction can approximate the depth of a deep NN.

  • @Ahmetkumas
    @Ahmetkumas 3 years ago +2

    Thanks for the video and effort. Can you make a time series video using XGBoost, or something with multiple features (lags, rolling mean, etc.)?

  • @rezaniazi4352
    @rezaniazi4352 2 years ago +1

    Thanks for the video.
    What do we have to change if we want to use XGBRegressor() instead of the classifier?
    The xgboost documentation is so confusing!

  • @andyn6053
    @andyn6053 9 months ago +1

    This was very clear and useful! Do you have any link to your code? Also, could XGBoost be used for linear regression as well?

  • @abderrahmaneherbadji5478

    thank you very much

  • @farhaddavaripour4619
    @farhaddavaripour4619 2 years ago

    Thanks for the video. Something you might have missed: in the figure, the most evolved species is shown with lighter hair than the less evolved ones, which could give the false impression that species with lighter hair are more evolved. It would be great if you could adjust the figure.

  • @Bwaaz
    @Bwaaz a year ago

    Very clear, thanks :)

  • @kakaliroy4747
    @kakaliroy4747 2 years ago +1

    The example of bagging is so funny and I can fully relate

  • @sbaet
    @sbaet 3 years ago +2

    Can you make a quick video on normalization and standardization for an image dataset?

  • @grantsmith3653
    @grantsmith3653 a year ago +2

    Sreeni said we need to normalize, but I always thought we didn't need to do that with trees... Am I confused about something?
    Thanks for the video!!
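
On the normalization question: the commenter is essentially right for tree models. Splits compare a feature against a threshold, so any monotonic per-feature rescaling (min-max or standardization) leaves the learnable tree structure unchanged; scaling mainly matters for distance- and gradient-based models such as SVMs or neural nets. A quick empirical check, assuming scikit-learn (the video's exact code may differ):

```python
# Empirical check that feature scaling barely affects a tree ensemble:
# train the same random forest on raw and on standardized features.
# Assumes scikit-learn; the dataset choice is illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

raw_model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
acc_raw = raw_model.score(X_test, y_test)

scaler = StandardScaler().fit(X_train)
scaled_model = RandomForestClassifier(random_state=0).fit(
    scaler.transform(X_train), y_train
)
acc_scaled = scaled_model.score(scaler.transform(X_test), y_test)
# The two accuracies come out essentially identical, since standardization
# is monotonic per feature and does not change the split ordering.
```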

  • @longtruong9935
    @longtruong9935 2 years ago

    The dataset in the UCI link is not available now. Could anyone provide an updated link?

  • @kangajohn
    @kangajohn 3 years ago

    If your explanations were a Kaggle competition, they would be top 1%.

  • @barrelroller8650
    @barrelroller8650 a year ago +1

    It's not clear where you got the dataset in CSV format - the .zip archive at the provided link includes only the `wdbc.data` and `wdbc.names` files.

  • @Lodeken
    @Lodeken 11 months ago

    Wow that analogy! 😂 Amazingly apt lol!

  • @khairulfahim
    @khairulfahim a year ago

    Where can I get the exact .csv file?

  • @kangxinwang3886
    @kangxinwang3886 3 years ago +1

    Loved the arranged marriage example! Made it very intuitive and easy to understand. Thank you!

  • @multiversityx
    @multiversityx a year ago

    What’s the method name? When are you presenting at NeurIPS? (I’ll be attending it :)

  • @darioooc
    @darioooc a year ago

    Great!

  • @ghafiqe
    @ghafiqe a year ago

    Perfect

  • @ramakrishnabhupathi4995

    Good one

  • @ahmedraafat8769
    @ahmedraafat8769 2 years ago +1

    The dataset has been removed from the website. Is it possible to upload it?

    • @DigitalSreeni
      @DigitalSreeni  2 years ago +2

      Just do a Google search for the keywords and you'll find it somewhere, maybe on Kaggle. I do not own the data, so I cannot legally share it.

  • @Frittenfinger
    @Frittenfinger 8 months ago

    Nice T-Shirt 😃

  • @v1hana350
    @v1hana350 2 years ago

    I have a question about the XGBoost algorithm: how does parallelization work in XGBoost? Could you explain with an example?

  • @vzinko
    @vzinko a year ago +13

    Another case of data leakage. You can't scale X and then split it into test and train. The scaling needs to happen after the split.

    • @Beowulf245
      @Beowulf245 10 months ago +1

      Thank you. At least someone understands.

    • @andyn6053
      @andyn6053 9 months ago

      So this video is incorrect?

  • @3DComputing
    @3DComputing 2 years ago

    You're worth more money

  • @andromeda1534
    @andromeda1534 3 years ago

    Looks like when you demoed random forest, you didn't comment out the xgb line, so you actually showed the fitting for xgb twice, with the same results.

  • @alejandrovillalobos1678
    @alejandrovillalobos1678 3 years ago +1

    Can you talk about transformers, please?

  • @user.................
    @user................. 21 days ago

    Bro is trying to share life lessons and forgot what he's teaching 🤣🤣🤣🤣
    The only place where I got a complete idea of XGBoost, thank you!

  • @agsantiago22
    @agsantiago22 2 years ago

    Merci !