Tutorial 9- Drop Out Layers in Multi Neural Network

  • Published 23. 07. 2019
  • After going through this video, you will know:
    Large weights in a neural network are a sign of a more complex network that has overfit the training data.
    Probabilistically dropping out nodes in the network is a simple and effective regularization method.
    A large network with more training and the use of a weight constraint are suggested when using dropout.
    Below are the various playlists created on ML, Data Science and Deep Learning. Please subscribe and support the channel. Happy Learning!
    Deep Learning Playlist: • Tutorial 1- Introducti...
    Data Science Projects playlist: • Generative Adversarial...
    NLP playlist: • Natural Language Proce...
    Statistics Playlist: • Population vs Sample i...
    Feature Engineering playlist: • Feature Engineering in...
    Computer Vision playlist: • OpenCV Installation | ...
    Data Science Interview Question playlist: • Complete Life Cycle of...
    You can buy my book on Finance with Machine Learning and Deep Learning from the below URL:
    Amazon URL: www.amazon.in/Hands-Python-Fi...
    🙏🙏🙏🙏🙏🙏🙏🙏
    YOU JUST NEED TO DO
    3 THINGS to support my channel
    LIKE
    SHARE
    &
    SUBSCRIBE
    TO MY YOUTUBE CHANNEL

Comments • 164

  • @forever-fz1hk
    @forever-fz1hk 4 years ago +29

    Krish sir, just one thing to say... I too sometimes teach school children, and the effort you are putting into making these videos free of charge is commendable... May God bless you, sir. I am gaining confidence too after seeing your videos, and am thus becoming a data scientist.

  • @snehalbm
    @snehalbm 2 years ago +7

    You are the mentor every aspiring data scientist needs, Thanks!!

  • @smitirashmiguru7649
    @smitirashmiguru7649 4 years ago +3

    Love the Deep Learning Series. Great learning!!

  • @anujeetswain7368
    @anujeetswain7368 4 years ago +2

    This deep learning series is extremely good.

  • @shivangirastogi9723
    @shivangirastogi9723 a year ago +5

    Thanks for putting in the effort to make these in-depth videos which clarify concepts in detail. Your videos are helping students like me who are very new to the ML and AI field.

  • @ebisaabebe615
    @ebisaabebe615 3 months ago

    I am an MSc student from Ethiopia. To tell you the truth, I have learnt a lot from your videos. May God bless your mind!!

  • @laxminarasimhaduggaraju2671

    I can see your face is full of happiness when you explain a concept.
    I guess you are like 🙏🙏

  • @maddybharathi
    @maddybharathi 4 years ago

    You have a knack for making things short, simple, and easy to grasp :)

  • @nandinisarker6123
    @nandinisarker6123 3 years ago

    I found it extremely useful, and easier to understand than many well-known experts.

  • @gooopin
    @gooopin 4 years ago +2

    Thanks for the sessions... These are precise and organized...

  • @grace30
    @grace30 3 years ago +3

    Really like the way you explain! I have just completed a Udemy bootcamp and you are definitely reinforcing what I have learned. Keep up the good work!

  • @theforgottenhealth3244
    @theforgottenhealth3244 3 years ago +1

    Great service. Amazing explanation!!

  • @shosad100
    @shosad100 3 years ago +1

    Krish sir, you are my favorite teacher... your lessons and explanations are simple and easy to understand; even a B-grade student like me can understand the concepts. Thank you, sir.

  • @shaz-z506
    @shaz-z506 5 years ago +18

    That's a good video, Krishna. I never thought about random forest using a similar mechanism when I was first studying dropout. You've cleared up the concept for me with this video. Thanks!

  • @elmoreglidingclub3030
    @elmoreglidingclub3030 3 years ago +1

    Great stuff. But I have to listen several times to understand, given our different dialects. Much appreciation for your work and explanations!! Excellent!

  • @mgreek31
    @mgreek31 a year ago

    The effort in these videos!!!
    Thanks Krish!!!

  • @marijatosic217
    @marijatosic217 3 years ago

    Great as always! Thank you :)

  • @priyasingh-zd1wm
    @priyasingh-zd1wm 4 years ago

    Such awesome content and explanations!!!

  • @davidhakobyan6377
    @davidhakobyan6377 3 years ago

    You explain very well! Thank you!

  • @RanjitSingh-rq1qx
    @RanjitSingh-rq1qx a year ago +2

    I have watched 10 videos but haven't coded anything yet. Still, I am sure that whenever I do code, I will perform more confidently, because these videos focus on the basics and cover ANNs in real depth. Thank you so much, sir. 🥰🥰😘🇮🇳🇮🇳

  • @Fatima-kj9ws
    @Fatima-kj9ws 3 years ago +1

    Great explanations, thank you very much, sir.

  • @paullan-learning-read-dev7040

    Thank you. Much easier to understand than the one by Andrew Ng.

    • @manishsharma2211
      @manishsharma2211 3 years ago +12

      But you can't ignore the fact that he is a god in AI.

    • @nabiltech1366
      @nabiltech1366 3 years ago +2

      Did you take and finish Andrew Ng's course?

    • @MrBemnet1
      @MrBemnet1 3 years ago

      @@nabiltech1366 Halfway. Did you finish?

    • @nabiltech1366
      @nabiltech1366 3 years ago +1

      @@MrBemnet1 No, bro. The way he teaches is very complicated for me, so I decided to learn a different way. When I have a little knowledge that I understand, I will try to retake the course, so it will be easier than before. What about you?

    • @MrBemnet1
      @MrBemnet1 3 years ago

      @@nabiltech1366 I don't get some of the concepts right away. I check other resources, then come back and view it again. I will finish everything within 2 weeks.

  • @AmitYadav-ig8yt
    @AmitYadav-ig8yt 4 years ago +5

    Thank you very much, you have been an angel for me. Please upload a video on the theory part of SVM, K-Means, or other unsupervised ML. Thanks a lot once again. Hari Om

  • @vikshukla44
    @vikshukla44 4 years ago

    Sir, you are amazing! You have cleared everything up.

  • @pedramdabaghian1329
    @pedramdabaghian1329 a year ago

    Thank you. It was so helpful.

  • @vantuannguyen4337
    @vantuannguyen4337 2 years ago

    I really love your energy.

  • @manjularathore1076
    @manjularathore1076 4 years ago +1

    Hi Krish, thanks for making such nice videos and for the excellent explanations. Finally I have found something I was looking for to better understand deep learning.

  • @hokapokas
    @hokapokas 5 years ago +1

    Good work as usual, Krish... Awaiting its implementation 🙏🙏

  • @smarthbakshi7041
    @smarthbakshi7041 2 years ago

    This man makes ML a cakewalk!

  • @pranjalijoshi6114
    @pranjalijoshi6114 2 years ago

    All your videos are very useful... thanks a lot for this good work.

  • @arohawrami8132
    @arohawrami8132 7 months ago

    Thanks a lot Krish for your best explanation.

  • @fthialbkosh1632
    @fthialbkosh1632 4 years ago

    Thanks a lot, sir, very good explanation.

  • @sandipansarkar9211
    @sandipansarkar9211 4 years ago +1

    Hello Krish. Came to know about the use of the random forest idea in deep learning. Thanks.

  • @adityashewale7983
    @adityashewale7983 11 months ago

    Hats off to you, sir. Your explanation is top level. Thank you so much for guiding us...

  • @Amanullah-lt6fq
    @Amanullah-lt6fq 2 years ago

    I have been watching your videos for a few months and I have learned a lot. Your channel deserves a subscription, and I have subscribed.

  • @wajidiqbal5633
    @wajidiqbal5633 2 years ago

    Very well explained. Thank you.

  • @ameygirdhari8703
    @ameygirdhari8703 3 years ago

    Simple and clear explanation.

  • @pankajverma-sw9oz
    @pankajverma-sw9oz 2 years ago +1

    I was always confused about deep learning; because of you, I got clarity.

  • @firstkaransingh
    @firstkaransingh a year ago

    Great explanation 👍

  • @9971916866
    @9971916866 3 years ago +2

    Thank you Krish for the video, this is excellent!! One question: dropout is applied at each epoch, so how does it combine the results from all the epochs?

  • @joseguilherme5008
    @joseguilherme5008 a year ago

    Great video 👏

  • @debopamsengupta4409
    @debopamsengupta4409 4 years ago

    Hi Krish, great work, a really smooth and informative explanation.

  • @adityagamingchanneltv9041

    Your lectures are superb

  • @shashwatdev2371
    @shashwatdev2371 3 years ago +2

    I have a doubt:
    Does the dropout ratio of a particular layer remain the same on every iteration or not? If not, do we take an average to multiply with the weights for the test data?

  • @anujsinha12
    @anujsinha12 4 years ago

    Hello @Krish Naik, you mentioned in the video that for test data w should be multiplied by P. Do we need to write code for that in the model, or does it happen automatically?
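
    For reference, in Keras/TensorFlow this happens automatically: the framework implements "inverted dropout", scaling the kept activations up by 1/(1 - rate) during training, so at test time the Dropout layer is simply an identity and no extra w*p code is needed. A minimal sketch, assuming TensorFlow 2.x; the rate and input are illustrative:

        import tensorflow as tf

        drop = tf.keras.layers.Dropout(rate=0.5)
        x = tf.ones((1, 4))

        y_train = drop(x, training=True)   # random units zeroed; survivors scaled by 1/(1-0.5) = 2
        y_test = drop(x, training=False)   # identity: nothing dropped, no extra scaling step needed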

  • @urwahmunir9636
    @urwahmunir9636 4 years ago +1

    Extraordinary step-wise teaching style. You made all my concepts clear. Can you please add a practical implementation of a neural network model in which all these techniques are used, like dropout, loss function, learning rate, regularization, and optimizer in one model implementation? Thanks in advance...
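
    Not the video's code, but a minimal Keras sketch combining those pieces in one model; the layer sizes, the 0.3 dropout rate, the L2 factor, and the learning rate are illustrative choices, not recommendations:

        import tensorflow as tf
        from tensorflow.keras import layers, regularizers

        model = tf.keras.Sequential([
            layers.Input(shape=(20,)),                              # 20 input features (illustrative)
            layers.Dense(64, activation="relu",
                         kernel_regularizer=regularizers.l2(1e-4)), # L2 regularization
            layers.Dropout(0.3),                                    # dropout after the hidden layer
            layers.Dense(64, activation="relu"),
            layers.Dropout(0.3),
            layers.Dense(1, activation="sigmoid"),                  # binary classification head
        ])

        model.compile(
            optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), # optimizer + learning rate
            loss="binary_crossentropy",                             # loss function
            metrics=["accuracy"],
        )
        # model.fit(X_train, y_train, epochs=20, validation_split=0.2)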

  • @sukumarroychowdhury4122

    Krish: You are the very best trainer

  • @Aliabbashassan3402
    @Aliabbashassan3402 4 years ago

    Thank you from Iraq... good job, brother.

  • @AKHILESHYADAV-ig7uv
    @AKHILESHYADAV-ig7uv 3 years ago

    It's a really good lecture series.

  • @AbdulRehman-hg9es
    @AbdulRehman-hg9es 3 years ago +1

    Great effort, Krish! I like your passion. I have one point of confusion about the dropout ratio: why are you using a dropout ratio of 0.5 for the input layer? To my knowledge, that should be higher (i.e. 1.0 or 0.9).

  • @aakashnishad7048
    @aakashnishad7048 5 years ago +2

    Thanks Krish

  • @dnakhawa
    @dnakhawa 4 years ago

    You teach very well... Great stuff about data science on your channel. Thanks, Harish!

  • @michaelloturco5584
    @michaelloturco5584 2 years ago

    Thank you for this excellent explanation! Could you link the original research paper you mentioned? (Or maybe I'm just not finding it in the description.)

  • @vishalvaibhav9697
    @vishalvaibhav9697 4 years ago +4

    Hello Krishna, first of all thank you so much for the videos, as a lot of my queries are getting cleared up by watching them. I have a better understanding of neural networks now, with all the maths behind them. I have one query for this particular video, though: what is batch normalization in neural networks, and how does it help in preventing over-fitting in a neural network?

  • @sharadkolse6871
    @sharadkolse6871 4 years ago

    Best explained :)

  • @ParthivShah
    @ParthivShah 10 months ago

    Thank you, sir.

  • @vipindube5439
    @vipindube5439 4 years ago

    Thank you.

  • @pawansharma-ij7kg
    @pawansharma-ij7kg 2 years ago

    Nice Explanation

  • @anonim5052
    @anonim5052 6 months ago

    Great!!!

  • @pankajkumarbarman765
    @pankajkumarbarman765 4 years ago

    Sir, you are great 💖

  • @gopalakrishna9510
    @gopalakrishna9510 4 years ago

    Sir, I think you're enjoying this teaching.
    Your expressions indicate you are enjoying the teaching...

  • @kuskargupt2887
    @kuskargupt2887 4 years ago

    Sir, since we randomly select some features or neurons, the weights are updated according to that set of neurons in that particular forward and backward pass. So how is the model going to predict the right answer when all the neurons are activated together for test data, given that we trained the weights when fewer neurons were active? How will the model sum up all the weights to give the right prediction (with the least error)?

  • @abdulqadar9580
    @abdulqadar9580 2 years ago

    Amazing, sir.

  • @vaibhavhariramani
    @vaibhavhariramani 3 years ago

    I just have a little query: if we keep activating and deactivating neurons while training,
    doesn't it cause overfitting when testing with all neurons activated at once, given that they were trained in different combinations during training?

  • @zx3215
    @zx3215 4 years ago +1

    In your sketch, did you really drop a couple of inputs out? Is this allowed in the dropout approach?

  • @tejateju6303
    @tejateju6303 a month ago +1

    The video explains the concept of dropout layers in deep neural networks, which help prevent overfitting by randomly deactivating a subset of neurons during training.
    Key moments:
    00:00 Artificial neural networks with many weight and bias parameters can run into overfitting issues; dropout regularization helps prevent overfitting by randomly dropping units during training.
    -Explanation of overfitting in deep neural networks due to excessive parameters and the need for regularization techniques like dropout.
    -Comparison between underfitting in single-layer neural networks and the role of multiple layers in preventing underfitting in deep neural networks.
    -Introduction to dropout regularization as a technique to prevent overfitting by randomly dropping units during training, with a reference to the 2014 paper by Srivastava, Hinton, et al. ("Dropout: A Simple Way to Prevent Neural Networks from Overfitting").
    03:54 The video discusses the concept of dropout layers in neural networks, where a subset of features or neurons is randomly deactivated during training to prevent overfitting and improve model generalization.
    -Explanation of how dropout layers work in neural networks: randomly deactivating a subset of features or neurons during training improves model generalization.
    -Comparison of dropout layers in neural networks to the selection of feature subsets in random forests, which creates diverse decision trees for better model performance.
    07:25 The dropout layer randomly deactivates some neurons and activates others during training to prevent overfitting, similar to random forest's feature selection and majority voting. For test data all neurons stay connected, with no deactivation, and the weights are multiplied by the dropout probability for prediction.
    -Comparison of the dropout layer with random forest's feature selection and majority voting to prevent overfitting in neural networks.
    -Explanation of how test data is handled with dropout: all neurons stay connected, and the weights are multiplied by the dropout probability for prediction.
    -Selecting the dropout ratio (p value) through hyperparameter optimization to prevent overfitting in deep neural networks, with a recommendation for a p value above 0.5.
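
    A minimal NumPy sketch of the mechanics summarized above, assuming p is the keep probability as in the video; the layer sizes are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)

        p = 0.5                      # keep probability (the video's dropout ratio)
        x = rng.normal(size=4)       # 4 input features (illustrative)
        W = rng.normal(size=(4, 3))  # weights into a hidden layer of 3 neurons

        # Training: each input unit is kept with probability p; dropped units output 0.
        mask = rng.random(4) < p
        h_train = (x * mask) @ W

        # Testing: nothing is dropped; instead every weight is multiplied by p,
        # so each neuron sees the same expected input as during training.
        h_test = x @ (p * W)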

  • @lol-ki5pd
    @lol-ki5pd a year ago

    Just a question: during backpropagation, we get updated weights for each neuron. Now when we backpropagate to the start, and random features are chosen again, what happens to the backpropagated weights?

  • @suyash.01
    @suyash.01 3 years ago

    Helpful!!

  • @sauravkr.mahato
    @sauravkr.mahato 2 months ago

    How simply he explained it.

  • @mohd.faizan3003
    @mohd.faizan3003 3 years ago

    Sir, I have a doubt: when the neurons are randomly selected based on the p value, then in the next epoch, from which neurons is the random selection performed, the activated ones or all of them?

  • @shubhamchauda425
    @shubhamchauda425 4 years ago

    I have a question: do we have to add different dropout layers for different layers, or do we add one for all layers?

  • @midhileshmomidi2434
    @midhileshmomidi2434 4 years ago +1

    Hi sir,
    I have a doubt.
    If we take p=0.5, will half of the features that are deactivated in the 1st epoch be reactivated in the 2nd epoch, and does the same go for the other features in upcoming epochs as well?
    Please explain.

  • @babbarutkarsh7770
    @babbarutkarsh7770 3 years ago

    Could there be a better explanation? Simply perfect!!

  • @prasantimohanty6750
    @prasantimohanty6750 4 years ago +1

    I have a doubt.
    For test data, for the neurons that were not activated we do p*w, but for the neurons that were activated, what do we do in that case?

  • @jeevanaddepalli5494
    @jeevanaddepalli5494 3 years ago +2

    I think during test time we should multiply the weights by the keep probability = (1 - dropout rate). Intuitively, the keep probability means what percentage of the time we used that weight (or edge, or connection) to train our NN. Please correct me if I am wrong, Krish sir.
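
    Both conventions exist, which is likely the source of the confusion: in the original Srivastava et al. paper, p denotes the probability of keeping a unit and test weights are scaled by p; if p instead denotes the drop rate, the test-time factor is (1 - p). A tiny NumPy check, assuming a drop rate of 0.2:

        import numpy as np

        rng = np.random.default_rng(1)

        drop_rate = 0.2
        keep_prob = 1.0 - drop_rate        # = 0.8
        w, x = 0.5, 1.0

        # Average training-time output of w*x under dropout is keep_prob * w * x.
        samples = (rng.random(100_000) < keep_prob) * w * x
        print(samples.mean())              # ~0.4
        print(keep_prob * w * x)           # 0.4 -> test-time scaling matches the expectation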

  • @chaoxi8966
    @chaoxi8966 2 years ago

    Hi, sir. I would like to know: in each epoch of training, does dropout have any relation to batch_size?

  • @koushikkonar4186
    @koushikkonar4186 a year ago

    Hi, in this video, when we apply the model to test data... what will be the weights of the deactivated neurons?

  • @Sovereignl55
    @Sovereignl55 a year ago

    Sir, if we're dropping some inputs and also hidden-layer neurons,
    won't it affect our output?
    I mean correct predictions.

  • @vishalaaa1
    @vishalaaa1 3 years ago

    Hi, you did not explain how the exploding gradient problem can be corrected. Is it also through ReLU?

  • @VamsiKrishna-vg6vd
    @VamsiKrishna-vg6vd 4 years ago +1

    For training data, suppose we ignore a few features and neurons as per the dropout ratio, calculate the weights, and update them with backpropagation. In the second step, another set of features and neurons is selected randomly, and we again calculate new weights. That doesn't make sense, right, as this will keep repeating with different random combinations... Please correct me if I am wrong... Thanks in advance.

    • @vaishnavkrishnan7996
      @vaishnavkrishnan7996 3 years ago

      So after all of this is done, the best set of features is selected for that particular output value, I guess.

  • @rahul-wz7rn
    @rahul-wz7rn 2 years ago

    If we apply a dropout ratio, is there any chance that the features selected the first time get selected the second time, or are new features selected?

  • @anindyabanerjee743
    @anindyabanerjee743 3 years ago

    Krish... you make my life easier.

  • @krishnakanthbandaru9308

    Sir, if in training we drop the x2 and x4 features, we won't get weights for them; then while testing, how do those (unknown) weights get multiplied by the dropout ratio? I did not get that; please explain...

    • @amankapri
      @amankapri 4 years ago +2

      @Krishna: The features x2 and x4 will be dropped out only in the 1st epoch. Once the epoch is completed, it will again select 2 other features as per the dropout ratio. Once this loop is completed, all the neurons in each layer will have some weight associated with them.

  • @commonboy1116
    @commonboy1116 3 years ago

    Super

  • @TheSougata1
    @TheSougata1 3 years ago

    Sir, will all weights be updated as (P*W) while testing, or will the P value be updated as (P*W)? Please clarify this.

  • @palashchanda9308
    @palashchanda9308 3 years ago

    Can you please provide a link for the Machine Learning playlist?

  • @uncommon_common_man
    @uncommon_common_man 4 years ago

    How can we say whether the model is underfitted or overfitted?

  • @amitghodke838
    @amitghodke838 4 years ago

    Can you explain how it helps to avoid the overfitting problem?

  • @debasispatra8368
    @debasispatra8368 4 years ago

    Krish, I have a doubt. Suppose I have 5 inputs and 5 neurons in my 1st hidden layer. At training time I give a dropout ratio of 0.5, and due to this, suppose 2 inputs and 2 neurons get deactivated. In this case we now have 3 inputs and 3 neurons left, so we have to train 9 weights. But at testing time we have to multiply the 'p' value with 25 weights, as at testing time all inputs and neurons exist. So how does this work?

    • @vaishnavkrishnan7996
      @vaishnavkrishnan7996 3 years ago

      I think the dropout ratio for the other deactivated neurons in the test set would be 0, I guess; doesn't make sense though.
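
      One way to see it: the mask is resampled at every iteration, so over many iterations all 25 weights eventually receive updates, and at test time the single scaled matrix p*W is used. A rough NumPy sketch of the resampling, with sizes matching the example above:

          import numpy as np

          rng = np.random.default_rng(3)

          p = 0.5
          updated = np.zeros((5, 5), dtype=bool)    # which of the 25 weights got trained

          for _ in range(20):                       # 20 training iterations
              in_mask = rng.random(5) < p           # surviving inputs
              out_mask = rng.random(5) < p          # surviving neurons
              updated |= np.outer(in_mask, out_mask)

          print(updated.sum(), "of 25 weights updated at least once")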

  • @sameerherkal9205
    @sameerherkal9205 2 months ago

    Hi @Krish,
    I was asked in an interview: what if we remove one hidden layer instead of using dropout? Wouldn't it be better to remove one hidden layer instead of using dropout?
    Can you please help me with the answer?

  • @shahariarsarkar3433
    @shahariarsarkar3433 3 months ago

    Please suggest a good reference book for deep learning.

  • @manikosuru5712
    @manikosuru5712 5 years ago +2

    Hi sir, amazing explanation...
    Small doubt:
    while multiplying the 'p' value with the weight 'w' for test data, do we include (add) the bias value with the input?

  • @Adinasa2
    @Adinasa2 4 years ago

    How do we determine the value of p?

  • @RajeshRajesh-sh7zj
    @RajeshRajesh-sh7zj 2 months ago

    In the next iteration, will the deactivated neurons get activated randomly???

  • @ketanchaudhari5642
    @ketanchaudhari5642 4 years ago

    Hello sir,
    just want to ask:
    during dropout we drop a few neurons, and at the time of testing we connect them all and update the weights of the whole network as (W*P).
    But what are the weights of the dropped neurons? Should we take W=1 for the dropped neurons?

    • @benvelloor
      @benvelloor 3 years ago

      All neurons will get updated with weights, as there are multiple forward and backward propagations. Dropout just prevents a random set of neurons from updating after each forward and backward iteration.

  • @absolutelynobody3837
    @absolutelynobody3837 3 years ago

    Wouldn't the weights in testing be w(1-p) rather than wp?

  • @zohaib386
    @zohaib386 3 years ago

    Alla👌👌

  • @shiffin_chippe
    @shiffin_chippe 4 years ago +6

    So when the neurons are reactivated what are their weights?

    • @akashkewar
      @akashkewar 4 years ago +4

      Their weights are the same as before, because you didn't update them using backpropagation. You only update the weights corresponding to the neurons that are activated in an iteration. So in the next iteration, if we happen to activate a neuron that was not active in the last iteration, its weight will be the same until backpropagation updates it (because that neuron is active now and hence will get updated).
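
      A small NumPy sketch of that point, assuming plain SGD on one layer's weights: the gradient is zero wherever the mask dropped a neuron, so those weights carry over unchanged to the next iteration.

          import numpy as np

          rng = np.random.default_rng(2)

          w = rng.normal(size=5)             # weights into 5 neurons
          lr = 0.1

          mask = rng.random(5) < 0.5         # True = neuron active this iteration
          grad = rng.normal(size=5) * mask   # dropped neurons contribute zero gradient

          w_new = w - lr * grad
          print(w_new[~mask] == w[~mask])    # dropped neurons keep their old weights: all True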

    • @shiffin_chippe
      @shiffin_chippe 4 years ago +2

      @@akashkewar Thanks for the reply after 8 months 😃😃😃♥️

    • @akashkewar
      @akashkewar 4 years ago +1

      @@shiffin_chippe :D "Better late than never". I hope you are doing fine in life; don't give up.

  • @mdmynuddin1888
    @mdmynuddin1888 2 years ago

    Can you or anyone please provide the thesis or research paper link?

  • @umarsadique7166
    @umarsadique7166 3 years ago

    How do I find the dropout ratio for my model???