Understanding Backpropagation In Neural Networks with Basic Calculus

  • Date added 22 May 2024
  • This video explains backpropagation in neural networks and deep learning using only basic calculus. In machine learning, backpropagation (or backprop) is a widely used algorithm for training feedforward neural networks. Backpropagation computes the gradient of the loss function with respect to the weights of the network. Accordingly, this video first explains how to compute partial derivatives and the gradient of multivariable functions before considering a simple neural network model.
    #Backpropagation #NeuralNetwork #DeepLearning
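
The chain-rule computation the description refers to can be sketched in a few lines of Python. This is a minimal illustration, not the video's exact model: a single sigmoid neuron with squared-error loss, where the analytic gradient from backprop is checked against a finite-difference estimate.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w, b, y):
    z = w @ x + b                  # pre-activation (scalar)
    a = sigmoid(z)                 # activation
    loss = 0.5 * (a - y) ** 2      # squared-error loss
    return z, a, loss

def backward(x, w, b, y):
    # Backprop = chain rule applied to the forward pass.
    _, a, _ = forward(x, w, b, y)
    dL_da = a - y                  # d(loss)/d(activation)
    da_dz = a * (1.0 - a)          # sigmoid'(z)
    dL_dz = dL_da * da_dz          # chain rule
    dL_dw = dL_dz * x              # dz/dw = x
    dL_db = dL_dz                  # dz/db = 1
    return dL_dw, dL_db

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.2, -0.3])
b, y = 0.05, 1.0
grad_w, grad_b = backward(x, w, b, y)

# Sanity check: compare dL/dw[0] with a central finite difference.
eps = 1e-6
w_plus = w.copy();  w_plus[0] += eps
w_minus = w.copy(); w_minus[0] -= eps
numeric = (forward(x, w_plus, b, y)[2] - forward(x, w_minus, b, y)[2]) / (2 * eps)
print(abs(grad_w[0] - numeric) < 1e-6)   # analytic and numeric gradients agree
```

The same check generalizes to multi-layer networks: each layer multiplies in one more Jacobian factor of the chain rule.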

Comments • 25

  • @hieuluc8888 · 1 month ago +2

    I couldn't really understand what a computational graph was until I watched this video. Thank you very much.

  • @asfandiyar5829 · 11 months ago +4

    Thank you again! This was incredibly simple to understand and exactly what I was looking for.

  • @omnnnooy3267 · 2 years ago +2

    I cannot express how thankful I am!

  • @parisahajibabaee2893 · 2 years ago +2

    You can present complex information in a simplified manner, many thanks!

  • @leonmark2416 · 11 months ago +2

    The best session that I can recommend to anyone. I studied computer science and I am not good at math, and this was really a saver to continue on.

  • @karthikb.s.k.4486 · 2 years ago +2

    Nice session

  • @williammartin4416 · 3 months ago +2

    Very helpful

  • @ChristopherKennedy-fm2lo · 1 month ago +1

    You're the goat

  • @user-db2zx3ox7u · 2 months ago +2

    Very good, thanks.

  • @esper5429 · 3 months ago +2

    The only video that explained everything to me.

  • @Fantastics_Beats · 1 year ago +2

    I love your teaching, brother!

  • @sparklexscrewyachivements813 · 6 months ago +2

    Thanks for the meaningful description.

  • @polycarpnalela2297 · 1 year ago +2

    Thanks!! It was very useful.

  • @sudheeshe1384 · 2 years ago +2

    Can you explain optimizer state?

  • @karthikb.s.k.4486 · 2 years ago +4

    Can you please do a video on different cross-validation techniques in machine learning?

  • @kuldeepkaur-do2ex · 1 month ago

    What is the phi with a1? Is it the activation function?

  • @patrikszepesi2903 · 4 months ago +2

    At 12:18, in the bottom right corner, I think the matrix with the x's should be a 1x3 matrix and not a 3x1 matrix; otherwise the matrix multiplication won't work.

    • @DrDataScience · 4 months ago +1

      Both vectors x and w are 3x1. Note that we are using the transpose of W, which means that the result would be (1 x 3) x (3 x 1), which is a scalar.

  • @llothar68 · 1 month ago

    Do I need to understand all this to use neural networks? I'm scared shitless, but my curriculum says I have to learn it.