Vanishing Gradient Problem in ANN | Exploding Gradient Problem | Code Example

  • Published 24. 07. 2024
  • Learn about the Vanishing and Exploding Gradient Problems in Artificial Neural Networks (ANNs) with practical code examples. Understand the challenges and solutions for training deep networks effectively. Improve your grasp of these important concepts in neural network training.
    Code - colab.research.google.com/dri...
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
    💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!
    ✨ Hashtags✨
    #GradientProblems #NeuralNetworks #codeexamples
    ⌚Time Stamps⌚
    00:00 - Intro
    01:11 - Vanishing Gradient Problem
    12:20 - Code Demo
    18:00 - How to handle VG problem
    20:17 - Code Demo
    23:18 - ReLU Activation Function
    25:27 - Code Demo
    27:28 - Weight Initialization Techniques
    28:11 - Batch Normalization
    28:34 - Residual Network
    31:51 - Outro
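The Colab link above is truncated, so here is a rough sketch of the effect the code demo illustrates: with sigmoid activations, each backprop step multiplies by a local derivative of at most 0.25, so gradients in the early layers shrink toward zero, while ReLU keeps the chain alive. Illustrative weights only, not the video's actual notebook:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Backprop through a chain of layers multiplies local derivatives.
# With sigmoid, each factor is at most 0.25, so the product shrinks fast.
grad = 1.0
for layer in range(20):
    grad *= sigmoid_deriv(0.0) * 0.5  # max sigmoid derivative times a small weight

print(f"gradient after 20 sigmoid layers: {grad:.3e}")

# ReLU's derivative is 1 for positive inputs, so the chain survives much longer.
relu_grad = 1.0
for layer in range(20):
    relu_grad *= 1.0 * 0.5  # ReLU derivative (z > 0) times the same weight

print(f"gradient after 20 ReLU layers:    {relu_grad:.3e}")
```

This is exactly why switching activations (23:18 in the video) mitigates the problem: the 0.25 factor per sigmoid layer becomes 1 per ReLU layer.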

Comments • 60

  • @pravinshende.DataScientist
    @pravinshende.DataScientist 2 years ago +15

    Hi sir, you are playing a big role in making me a good data scientist... thank you very much!!!

  • @amLife07
    @amLife07 18 days ago +1

    Thank you sir for another amazing lecture!! 😃

  • @MrChudhi
    @MrChudhi 2 years ago +4

    Very nice talk. A lot of my questions are answered. Thanks a lot.

  • @rahilafiroz9995
    @rahilafiroz9995 5 months ago +1

    Awesome explanation for each and every step. Thank you very much, sir.

  • @user-pd1gy8xh4y
    @user-pd1gy8xh4y 10 months ago +2

    Excellent playlist on Deep learning.

  • @tarunkumardas8347
    @tarunkumardas8347 8 months ago +2

    Excellent explanation, sir!

  • @sanjaisrao484
    @sanjaisrao484 7 months ago +1

    Very good explanation, thanks.

  • @zkhan2023
    @zkhan2023 2 years ago +2

    Sir, best of the best explanation, thanks.

  • @ashishsinha5338
    @ashishsinha5338 3 months ago +1

    The best explanation so far.

  • @sandhyabansal5264
    @sandhyabansal5264 2 months ago

    Great and mind opening videos

  • @savyasachi6988
    @savyasachi6988 a month ago

    so nice great way of teaching

  • @ParthivShah
    @ParthivShah 3 months ago +2

    Thank You Sir.

  • @AkashBhandwalkar
    @AkashBhandwalkar 2 years ago

    1st comment, thanks so much for taking your time and teaching us.

  • @nikhiliyer8436
    @nikhiliyer8436 a year ago +6

    Please cover EXPLODING GRADIENT also with examples

  • @sayampalrecha3947
    @sayampalrecha3947 a year ago +2

    great video 👏

  • @abhishekanand5898
    @abhishekanand5898 2 years ago +2

    Awesome ❤️🔥

  • @vanshshah6418
    @vanshshah6418 2 years ago +1

    Sir, please upload the next videos! I have been waiting for the last 10 days, regularly checking your playlist. Please, sir.

  • @pratikmahadik3675
    @pratikmahadik3675 2 years ago +2

    Great🔥

  • @jayantsharma2267
    @jayantsharma2267 2 years ago +2

    great content

  • @rb4754
    @rb4754 a month ago

    Awesome

  • @rafibasha4145
    @rafibasha4145 2 years ago +9

    Hi, the sigmoid derivative lies between 0 and 0.25.
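
The claim checks out numerically; a quick illustrative check (not from the video):

```python
import numpy as np

z = np.linspace(-10, 10, 10001)        # grid that includes z = 0
s = 1.0 / (1.0 + np.exp(-z))           # sigmoid
d = s * (1.0 - s)                      # its derivative, s(z) * (1 - s(z))

# The derivative peaks at z = 0, where s = 0.5, giving 0.5 * 0.5 = 0.25.
print(f"max sigmoid derivative: {d.max():.4f}")  # → 0.2500
```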

  • @aashishmalhotra
    @aashishmalhotra 2 years ago +1

    Sir thank you

  • @kshitiztiwari9275
    @kshitiztiwari9275 2 years ago +1

    Bhai, I have a confusion and it might be very trivial. In pipelines with ColumnTransformers, whenever I use OrdinalEncoding on certain columns and then OHE on some other column, OHE fails. It gives an error like ValueError: could not convert string to float: 'AllPub'. This is the Housing Kaggle competition dataset. Is there something I am missing here? I am using the ordinal encoder before OHE and providing categories in it. Anyone else can help too..
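
For anyone hitting the same error: it usually means a string column reached a numeric step without being encoded. One working pattern is to give each encoder its own column list inside a single ColumnTransformer; a hedged sketch with invented column names (only 'Utilities'/'AllPub' comes from the comment, the rest is made up):

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder

# Toy frame standing in for the housing data (column names are invented).
df = pd.DataFrame({
    "Utilities": ["AllPub", "NoSewr", "AllPub", "NoSewr"],  # ordinal
    "Neighborhood": ["A", "B", "A", "C"],                   # nominal
    "LotArea": [8450, 9600, 11250, 9550],                   # numeric
})

# Each transformer lists ITS OWN columns; a column handled by
# OrdinalEncoder must not be listed again under OneHotEncoder.
pre = ColumnTransformer(
    transformers=[
        ("ord", OrdinalEncoder(categories=[["NoSewr", "AllPub"]]), ["Utilities"]),
        ("ohe", OneHotEncoder(handle_unknown="ignore"), ["Neighborhood"]),
    ],
    remainder="passthrough",  # numeric columns pass through untouched
)

X = pre.fit_transform(df)
print(X.shape)  # 4 rows; 1 ordinal + 3 one-hot + 1 passthrough = 5 columns
```

If the error persists, check that every string column appears in exactly one transformer's column list, since anything left to `remainder="passthrough"` stays a string.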

  • @pavangoyal6840
    @pavangoyal6840 2 years ago

    Thank you..

  • @barunkaushik7015
    @barunkaushik7015 a year ago +2

    Superb

  • @rafibasha4145
    @rafibasha4145 a year ago +2

    @11:50, please let us know when you are covering TensorBoard and callbacks.

  • @asmitpatel9746
    @asmitpatel9746 2 years ago +1

    👍👍

  • @aadilkhan9725
    @aadilkhan9725 2 years ago +7

    Sir, after DL please make a video on an OpenCV tutorial using DL.

    • @riteshsbar
      @riteshsbar 5 months ago

      Yes, please make a video on an OpenCV tutorial using DL.

  • @ManasNandMohan
    @ManasNandMohan a month ago

    Sir, we are finding it difficult to crack data science placements, so please make a dedicated 100-day series of placement tricks & tips. Requesting everyone to upvote this so that sir notices it.

  • @shahmuhammadraditrahaman9904

    ❤️

  • @StoryScapebyShan
    @StoryScapebyShan a month ago

    completed

  • @aakiffpanjwani1089
    @aakiffpanjwani1089 3 months ago

    How about we use the learning rate to avoid the VGP? Like, we could use lr = 10 or 100 or 1000, so when it multiplies with the derivatives the value is higher again, countering the deep multiplication of 0.1s in a NN. Correct me if I'm wrong.
    Thanks
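
For what it's worth, a large learning rate does not fix the vanishing gradient: the early-layer gradients are already near zero before the learning rate multiplies them, while the same large rate makes the healthy late-layer updates explode. A rough numeric sketch with made-up magnitudes:

```python
# Early-layer gradient after many sigmoid layers is astronomically small,
# so even a huge learning rate barely moves that weight...
early_grad = 0.1 ** 20          # product of 20 small local derivatives
lr = 1000.0
early_update = lr * early_grad
print(f"early-layer update: {early_update:.3e}")   # still ~1e-17

# ...while the last layer's gradient is of order 1, so the same learning
# rate makes its update enormous and training diverges.
late_grad = 0.5
late_update = lr * late_grad
print(f"late-layer update:  {late_update:.1f}")    # 500.0
```

A single global learning rate cannot rescale a ratio of ~1e20 between layers, which is why the fixes in the video target the derivatives themselves (ReLU, initialization, batch norm, residual connections).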

  • @sripavani6328
    @sripavani6328 4 months ago

    Can you please make a video on ResNets?

  • @znyd.
    @znyd. 2 months ago

    🖤

  • @ajitkumarpatel2048
    @ajitkumarpatel2048 a year ago

    🙏

  • @AkashBhandwalkar
    @AkashBhandwalkar 2 years ago +5

    Sir, please make a video on hypothesis testing. I tried searching your channel but did not find it. I'd love to learn it from you.
    If anyone else can help me find it, let me know and send the links in the thread.

  • @ali75988
    @ali75988 8 months ago

    31:42 Boss, gradient clipping is also still left to cover.

  • @swethamk7495
    @swethamk7495 2 years ago

    I am getting an error in the movie recommender system using ML. Sir, please help.

  • @mr.deep.
    @mr.deep. 2 years ago +2

    sir please upload next video

  • @bibhutibaibhavbora8770
    @bibhutibaibhavbora8770 10 months ago +1

    Sir, please explain the ResNet topic.

  • @HUNTER-wp3mc
    @HUNTER-wp3mc 3 months ago

    6:01 Sir, what does dŷ/dz stand for?

  • @rahulme
    @rahulme a year ago +1

    What if we keep sigmoid as the activation function and increase the learning rate? Eventually it will make those values higher... so will it help, keeping the layers the same and just increasing the learning rate?

    • @ShubhamMitkari
      @ShubhamMitkari a year ago +1

      Increasing the learning rate will prevent the model from converging easily.

    • @rahulme
      @rahulme a year ago +1

      @@ShubhamMitkari Oh ok bro, thanks... I thought that if we increase the values it would prevent the vanishing gradient, as the values become higher...

  • @zkhan2023
    @zkhan2023 2 years ago

    Sir, please explain random_state: what is it used for?
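
In short: random_state fixes the seed of any randomised step so the result is reproducible across runs. A small illustration, assuming scikit-learn's train_test_split:

```python
from sklearn.model_selection import train_test_split

data = list(range(10))

# Same random_state -> identical split every time the code runs.
a_train, a_test = train_test_split(data, test_size=0.3, random_state=42)
b_train, b_test = train_test_split(data, test_size=0.3, random_state=42)
print(a_test == b_test)  # True: the shuffle is reproducible

# A different (or omitted) random_state can give a different split,
# which makes it hard to compare two model runs fairly.
c_train, c_test = train_test_split(data, test_size=0.3, random_state=7)
```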

  • @ali75988
    @ali75988 8 months ago

    11:55 Boss, TensorBoard and callbacks are also still left to cover.

  • @zainulabideen9758
    @zainulabideen9758 2 years ago

    Bro, on which day will the next video be uploaded?

  • @elonmusk4267
    @elonmusk4267 20 days ago

    god

  • @ahmadtalhaansari4456
    @ahmadtalhaansari4456 11 months ago

    Revising my concepts.
    August 12, 2023😅

  • @vanshshah6418
    @vanshshah6418 2 years ago

    sir, please upload videos

  • @vanshshah6418
    @vanshshah6418 2 years ago +2

    Sir, approximately how much time would it take to complete this course?

  • @SS-yb1qd
    @SS-yb1qd 2 years ago

    Sir, if you want more views and likes, then start some Bollywood channel. This channel's subscribers are intelligent and will build something constructive. If you want to earn money from this channel, we are ready to donate, or launch some course like Krish Naik does.

  • @ABHISHEKKUMAR-gv4di
    @ABHISHEKKUMAR-gv4di 2 years ago

    While explaining the exploding gradient, how come 1 - 100 = 99? It should be -99, hence the weights will keep becoming smaller and smaller, i.e. more negative.
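
The commenter is right about the sign: with the update rule w_new = w_old - lr * gradient, a gradient of 100 (taking lr = 1 and w_old = 1) gives 1 - 100 = -99, so the weight swings to a large negative value rather than +99. The magnitude still explodes either way; only the sign differs. A one-line check:

```python
# Gradient-descent update: w_new = w_old - lr * grad
w, lr, grad = 1.0, 1.0, 100.0
w_new = w - lr * grad
print(w_new)  # -99.0, not 99.0: the magnitude blows up, the sign flips
```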