Backpropagation in Deep Learning | Part 1 | The What?
- Uploaded: 18 Aug 2024
- In this video, we'll break down the fundamentals of Backpropagation, a key concept in neural networks. Join us for a simplified explanation and demystify the process that makes deep learning possible.
Video on Gradient Descent
• Gradient Descent From ...
Video on Forward Propagation
• Forward Propagation | ...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.camp...
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
👍If you find this video helpful, consider giving it a thumbs up and subscribing for more educational videos on data science!
💭Share your thoughts, experiences, or questions in the comments below. I love hearing from you!
✨ Hashtags✨
#Backpropagation #NeuralNetworks #DeepLearning #MachineLearning #AI #LearningBasics #SimplifiedLearning
⌚Time Stamps⌚
00:00 - Intro
02:40 - What is Backpropagation?
09:06 - Step by Step Explanation
53:53 - Outro
Cancelled Netflix and Hotstar subscriptions. Binge watching CampusX these days! Magnificent 100/10 ratings!
Me too 😂 Sir is really an amazing teacher. He speaks with such clarity and in such an interesting way that it's no less than a movie 😁 May god bless him for his selfless work 🥰
Exactly 😊 @user-jc8kr7jt8v
😅😅😅
Me too
Sir just complete Deep learning series the way you teach is magnificent, eventually people will find this treasure soon,just keep doing good work
Yeah bro, but I am stuck on the NLP playlists. I have been following bhaiya's videos for the last year, and I don't understand other teachers' teaching; that's why I commented.
I was not able to understand this from my professors at fancy universities in the UK. A big salute to you, Sir.
Wow, kid!
That's where the money is, bro!
Started with a small doubt in hard margin problem in SVM and saw the whole ML playlist on 2x in 2 days. It connected all the missing dots or missed concepts like a story.
I am pursuing an MS in data science, and I have never found such a good connection between deep learning and ML, or such a concise explanation. Hats off. 🙌🙌
Day 4 and I am in the middle of revising deep learning concepts with such a good explanation
Sir, your 1-hour video feels like 10 minutes, and it's as satisfying as watching an interesting web series. Thanks a lot, the best teacher 🙏
He is the best teacher. His way of teaching is outstanding and better than all the universities of Pakistan.
Niteshji... I am speechless; I cannot appreciate your efforts for us enough...
I never thought that even in higher-level studies you could find such good teachers...
Your work is awesome... and unique...
He is the best teacher. His way of teaching is outstanding and better than all the universities of Canada.
Sir, what are you made of?? Inch by inch, point by point, you explained this concept just like baby footsteps... Sir, in my entire 30+ years of learning journey I have never seen such a teacher / mentor... I mean, YOU ARE GOD !! :-)
You must have grown old by now.
There are no words; such an excellent explanation.. a lot of thanks, sir.
Lectures are just great, I mean just great ..... At 41:25 and 42:02, in place of O21 there should be O12. By the way, you can ignore this small thing; I was working it out on my own, so I found it and wanted to comment something, and finding even this little error is so hard because your teaching is just mind-blowing. Thank you for being there, and happy days, I will continue...... Sayonaraaa 💃💃💃💃
Yes, I also caught this.
I was searching for this comment.
Nitish Sir, can't thank you enough for creating this masterpiece of a playlist for free. It's the best resource. Will always be indebted to you.
The greatest explanation of backpropagation and the chain rule. Thanks a lot.
Never seen such a simple explanation for back propagation. You are an exceptionally gifted teacher. Thank you for your hard work and clear teaching Nitish Sir.
The way you explained the whole thing is literally awesome! With this video, I am sure every student will understand backpropagation clearly, in an insightful way.
See how things like derivatives and partial derivatives work in practice and solve real-world problems. This man shows us by explaining them so easily. A big like to you, sir. ❤
I was so scared to learn backprop, but this video is so easy and so good. Excellent! 1000/10
Big fan of this deep learning series and the way you explain. Please upload daily in this series so I can finally get an internship 😂😂😂
Superlative Explanation. I have gone through many videos regarding back-prop, it is the best explanation by far. Thanks Nitish ji
From machine learning to deep learning,
great and fully detailed stuff.
I have seen all YouTube channels for backpropagation, but your explanation is the best.
Best Data science teacher in the world. Period!!
you are the BEST teacher on youtube.
The best explanation of backpropagation so far; a truly good way of teaching.
The best video on backpropagation, sir, thanks to you.
Just one word, Fantastic! Thanks a lot!
CampusX is an Addiction🙏😍.
Your lecture is not boring but like a story.. ❤❤❤❤❤
I found your video while working on my final-year project, sir. Since then I have been following and learning from you. There are still many things to learn.
Best & clear compilation of all the important loss functions. Thank you for this video.
I am very happy to see such a detailed derivation. At 41:25 it should be O12; by a typo you kept O21. Please have a look.
Absolutely amazed by your teachings!!
Thank you so much sir for such clear explanation, the way you explained backpropagation is literally very awesome.
Thanks for the wonderful explanation!
Mind blowing, now everything clear in backpropagation
I like your teaching method.. thank you sir.. you explain in the best way.... you give an answer to every "why" question and I like it.. thanks again!!
Thanks for this much clarity ! Very very grateful to your efforts
Great, hats off to you, Sir; your teaching style is really awesome 🙏🙂
Thanks, sir, for this great content. Now I understand backpropagation.
Very well explained. I am from a Java background and have understood the deep learning concepts very well so far.
your explanation is amazing, sir 🖤
My good luck! I finally found the gold mine ❤
Sir, even after having studied all this before, studying from you has its own joy. Thanks a lot for this great content.
Also, a request: please make some videos in the future on transformers using Hugging Face. I really need this. Thanks.
You made my day, sir. My heart is absolutely delighted 😊
There is a mistake at 42:02: it should be O12 instead of O21 in the ŷ equation.
Thank you sir 😇
Thank You Sir.
What an explanation!!!!!!!!!!!!!
You are going to be famous soon, remember my words. The way you explain things takes so much effort ❤
Guys, please watch the gradient descent video first, because it is very important.
one of the best explanation sir
You are a legend, sir. Seriously, you made things so easy to understand.. hats off, and thanks for all the knowledge.
The first teacher in my life whose thinking is exactly like mine: understanding any topic in terms of only the what, why, and how.
Thanks for your way of teaching, sir.
This man takes you deep into every topic. Hats off 🎉🎉🎉
Awesome explanation.. I don't think anyone else could explain this so perfectly. Hats off.
Very well explained.
Sir! you are great!!!
One of the best teachers🙏🙏
Bhaiya, two requests: the first is "Our Community" and the second one is the "NLP Playlist".
Wow... really, sir, hats off to you... what an explanation ❤❤❤❤❤❤❤
Niteshji, I'm heartily grateful and thankful to you for teaching us. While teaching the derivatives used to find the update values of the weights and biases, if you had used real-value examples for the calculation, it would have been even more beneficial for our understanding.
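(Editor's note on the request above: a minimal numeric sketch of a single weight-and-bias update, with made-up values rather than anything from the video; it assumes a one-neuron linear model with squared-error loss and learning rate 0.1.)

```python
# Illustrative single-parameter update (made-up values, not from the video).
# Model: y_hat = w*x + b, loss L = (y - y_hat)**2, learning rate 0.1.
x, y = 2.0, 1.0               # one training example
w, b = 0.5, 0.1               # initial parameters
lr = 0.1                      # learning rate

y_hat = w * x + b             # forward pass: 1.1
dL_dyhat = -2 * (y - y_hat)   # dL/dy_hat = -2(y - y_hat) = 0.2
dL_dw = dL_dyhat * x          # chain rule: dy_hat/dw = x, so dL/dw = 0.4
dL_db = dL_dyhat * 1.0        # dy_hat/db = 1, so dL/db = 0.2

w = w - lr * dL_dw            # updated w ≈ 0.46
b = b - lr * dL_db            # updated b ≈ 0.08
```

Repeating this update over many examples is exactly the gradient-descent loop the earlier video covers.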
Sir, a nice explanation of backpropagation, but I request, please upload videos regularly.
excellent sir
Thank you so much sir 😊❤
Amazing video, a nice way of explaining. Thanks a lot, Sir, for this amazing video.
Just awesome sir🎉❤
Now that's what a lecture from first principles is called.
You are exceptional, Great teaching skills. Thanks sir
Well explained sir 🙏🏻
No one has ever explained backpropagation like you 🔥
Great explanation. Thank you so much for this. Keep up the great work.
Incredible!!!!!!!Hats Off to you...
Best explanation ever. Thanks a thousand times for your effort to make such a complex thing so easy for us :)
thanks sir for simple and wonderful explanation
unparalleled explanation. Amazing
YouTube's best video, even better than Andrew Ng.
What an explanation.
Thanks a lot 😀
amazing explanation for backprop , thanks sir 🙌🏻
Amazing!!🤯🤯
Best Ever!!
Legend of Deep learning
Very good tutorial. Thank you very much for such a good tutorial.
25:55 What is the reason behind finding the gradient?
48:30 What is back propagation algorithm?
Damn, that was too easy.. you made it look so simple that anyone can understand it. Again, thank you so much for your hard work.
@28:00 -- Chain Rule Of Differentiation (Computing Gradient of the Loss Function)
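(Editor's note on the chain-rule step flagged above: a tiny sketch with invented values, using an identity activation for simplicity rather than the network from the video, of how dL/dw factors into a product of local derivatives.)

```python
# Chain rule through one unit (invented values; identity activation for
# simplicity, unlike the video's network).
# z = w*x + b, o = z, y_hat = v*o, L = (y - y_hat)**2
x, y = 1.0, 0.0
w, b, v = 0.3, 0.0, 0.5

z = w * x + b                 # pre-activation: 0.3
o = z                         # identity activation
y_hat = v * o                 # prediction: 0.15
L = (y - y_hat) ** 2          # loss: 0.0225

# dL/dw = dL/dy_hat * dy_hat/do * do/dz * dz/dw
dL_dyhat = -2 * (y - y_hat)   # 0.3
dyhat_do = v                  # 0.5
do_dz = 1.0                   # identity's derivative
dz_dw = x                     # 1.0
dL_dw = dL_dyhat * dyhat_do * do_dz * dz_dw   # 0.15
```

Each factor is a local derivative; backpropagation just computes and multiplies them from the loss backwards to each weight.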
such a clear explanation!!
Crystal clear concept
Salute sir...
Thank you so much, sir, for delivering a great session.
Thank you so much for explanation sir 🙏
@CampusX
I have a question regarding the calculation at 43:16 in your video. When determining the derivative of O11 with respect to W1_11 , you seem to be using the pre-activation value. However, shouldn't O11 be the post-activation output (i.e., after applying the activation function)? Thank you very much for your efforts.
amazing
great
This was THE BEST
"If math is hard, it means you studied with the wrong teacher."
Now I feel deep learning is not so hard .
Wonderful explanation sir.
Whattttaa great explanation, thank you Sir! 🙏❤
Thank you sir you explained in detail 🔥🔥🔥
If I had used the sigmoid function in my hidden layers, then for the outputs O11 & O12 I should take the derivative of the sigmoid function, right?
O11 -> 1/(1+e^-z), where z -> w11*IQ + w21*CGPA + bias term.
Huh? anyone?
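(Editor's note on the sigmoid question above: yes, once a hidden unit uses a sigmoid, its backward pass multiplies in σ'(z) = σ(z)(1 − σ(z)). A minimal sketch with an illustrative z value, not values from the video:)

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# If O11 = sigmoid(z) with z = w11*IQ + w21*CGPA + b, the backward pass
# multiplies in dO11/dz = sigmoid(z) * (1 - sigmoid(z)).
z = 0.0                  # illustrative pre-activation value
o = sigmoid(z)           # 0.5
do_dz = o * (1.0 - o)    # 0.25, the sigmoid derivative at z = 0
```

With an identity (linear) activation that extra factor is just 1, which is why it does not appear in the linear case.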
At 33:36, shouldn't dŷ/db21 = 0, since the derivative of a constant is zero?
Goddddd level teacher ❤️🤌🏻