Understanding Backpropagation In Neural Networks with Basic Calculus
- Published 22 May 2024
- This video explains backpropagation in neural networks and deep learning using only basic calculus. In machine learning, backpropagation (or backprop) is a widely used algorithm for training feedforward neural networks: it computes the gradient of the loss function with respect to the weights of the network. Accordingly, the video first explains how to compute partial derivatives and the gradient of multivariable functions before working through a simple neural network model.
#Backpropagation #NeuralNetwork #DeepLearning
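As a companion to the explanation in the video, here is a minimal sketch of backpropagation for a single neuron with a sigmoid activation and squared-error loss. The input values, weights, and target below are made up for illustration; the gradient from the chain rule is checked against a numerical (finite-difference) gradient.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny network: y = sigmoid(w . x + b), loss L = (y - t)^2.
x = np.array([0.5, -1.0, 2.0])   # input vector (illustrative values)
w = np.array([0.1, 0.4, -0.2])   # weights (illustrative values)
b = 0.3                          # bias
t = 1.0                          # target output

# Forward pass
z = w @ x + b
y = sigmoid(z)
loss = (y - t) ** 2

# Backward pass via the chain rule:
#   dL/dw = dL/dy * dy/dz * dz/dw
#   dL/dy = 2(y - t),  dy/dz = y(1 - y),  dz/dw = x
dL_dy = 2.0 * (y - t)
dy_dz = y * (1.0 - y)
dL_dw = dL_dy * dy_dz * x
dL_db = dL_dy * dy_dz

# Sanity check: compare with a numerical gradient.
eps = 1e-6
num_grad = np.zeros_like(w)
for i in range(len(w)):
    wp, wm = w.copy(), w.copy()
    wp[i] += eps
    wm[i] -= eps
    num_grad[i] = ((sigmoid(wp @ x + b) - t) ** 2
                   - (sigmoid(wm @ x + b) - t) ** 2) / (2 * eps)

assert np.allclose(dL_dw, num_grad, atol=1e-6)
```

The same chain-rule pattern, applied layer by layer from the output back to the input, is what backpropagation does in a full multi-layer network.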
I couldn't really understand what a computational graph was until I watched this video. Thank you very much.
Thank you!
Thank you again! This was incredibly simple to understand and exactly what I was looking for.
I cannot express how thankful I am!
You can present complex information in a simplified manner, many thanks!
The best session that I can recommend to anyone. I studied computer science and I am not good at math, and this was really a saver to continue on.
Nice session
Very helpful
You're the goat
very good thanks
Thank you!
The only video that explained everything to me
Thank you so much!
I love your teaching, brother
thanks for the meaningful description
Thank you!
Thanks!! It was very useful
Thanks!
Can you explain about optimizer state
Can you please do a video on different cross validation techniques in machine learning
Great suggestion, thanks!
What is phi with a1? Is it the activation function?
At 12:18, in the bottom right corner, I think the matrix with the x's should be a 1x3 matrix and not a 3x1 matrix; otherwise the matrix multiplication won't work.
Both vectors x and w are 3x1. Note that we are using the transpose of w, so the product is (1x3) x (3x1), which is a scalar.
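To illustrate the point about dimensions, here is a small NumPy sketch (the numbers are made up): both x and w are 3x1 column vectors, and w-transpose times x gives a 1x1 result, i.e. a scalar.

```python
import numpy as np

# Two 3x1 column vectors (illustrative values)
x = np.array([[1.0], [2.0], [3.0]])
w = np.array([[0.5], [-1.0], [0.25]])

# w^T x : (1x3) @ (3x1) -> (1x1), effectively a scalar
result = w.T @ x
print(result.shape)        # (1, 1)
print(result.item())       # 0.5*1 - 1.0*2 + 0.25*3 = -0.75
```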
Do I need to understand all this to use neural networks? I'm scared stiff, but my curriculum says I have to learn it.