Deep Neural Network Python from scratch | L layer Model | No Tensorflow

  • Published on 27. 07. 2024
  • We will create a Deep Neural Network in Python from scratch. We are not going to use TensorFlow or any built-in model; the entire code for a Deep Neural Network with L layers is written from scratch in Python.
    This video is for those enthusiasts who love to know the under-the-hood details of how things work. You can directly use a TensorFlow model to create a Deep Neural Network, but if you are curious about how things work in Python from scratch, then this video is for you. The main steps follow the timestamps below, and a short code sketch of the same pipeline is included at the end of this description.
    Understanding a Deep Neural Network in Python from scratch helps you learn how deep learning actually works and gives you confidence in understanding Machine Learning.
    And if you have followed my playlist on Neural Networks, then writing this code will be super simple for you. I have tried to explain a very difficult piece of code in a simple manner, so please let me know in the comments section what you think of this video.
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    Timestamps:
    0:00 Coming Next
    0:30 Intro
    3:14 Overview
    6:32 Initializing Parameters
    14:57 Forward Propagation
    23:36 Cost Function
    26:22 Backward Propagation
    33:10 Update Parameters
    34:23 Complete Model
    40:36 Improving Model Look
    48:44 End
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    📕 Complete Code and Assignment: github.com/Coding-Lane/Deep-N...
    📕 Neural Network Playlist: • How Neural Networks wo...
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    ✔ CNN Playlist: • What is CNN in deep le...
    ✔ RNN Playlist: • How Neural Networks wo...
    ✔ Logistic Regression Playlist: • Logistic Regression Ma...
    ✔ Linear Regression Playlist: • What is Linear Regress...
    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: / @codinglane
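    ➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
    Quick reference: below is a minimal, self-contained sketch of the L-layer pipeline the timestamps walk through (initialize parameters, forward propagation, cost, backward propagation, update). The function and variable names are illustrative only and may differ from the code written in the video.

    import numpy as np

    # Minimal L-layer network: ReLU hidden layers, sigmoid output,
    # binary cross-entropy cost. Illustrative names, not the video's exact code.
    def initialize_parameters(layer_dims):
        params = {}
        for l in range(1, len(layer_dims)):
            params['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
            params['b' + str(l)] = np.zeros((layer_dims[l], 1))
        return params

    def forward_propagation(X, params):
        L = len(params) // 2
        cache = {'A0': X}
        A = X
        for l in range(1, L + 1):
            Z = np.dot(params['W' + str(l)], A) + params['b' + str(l)]
            A = 1 / (1 + np.exp(-Z)) if l == L else np.maximum(0, Z)   # sigmoid at output, ReLU elsewhere
            cache['Z' + str(l)], cache['A' + str(l)] = Z, A
        return A, cache

    def compute_cost(AL, Y):
        m = Y.shape[1]
        return -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m

    def backward_propagation(AL, Y, cache, params):
        grads, m, L = {}, Y.shape[1], len(params) // 2
        dZ = AL - Y                                        # sigmoid + cross-entropy shortcut
        for l in reversed(range(1, L + 1)):
            grads['dW' + str(l)] = np.dot(dZ, cache['A' + str(l - 1)].T) / m
            grads['db' + str(l)] = np.sum(dZ, axis=1, keepdims=True) / m
            if l > 1:
                dA = np.dot(params['W' + str(l)].T, dZ)
                dZ = dA * (cache['Z' + str(l - 1)] > 0)    # ReLU derivative
        return grads

    def update_parameters(params, grads, learning_rate):
        for l in range(1, len(params) // 2 + 1):
            params['W' + str(l)] -= learning_rate * grads['dW' + str(l)]
            params['b' + str(l)] -= learning_rate * grads['db' + str(l)]
        return params

    def model(X, Y, layer_dims, learning_rate=0.1, iterations=1000):
        params = initialize_parameters(layer_dims)
        for i in range(iterations):
            AL, cache = forward_propagation(X, params)
            cost = compute_cost(AL, Y)
            grads = backward_propagation(AL, Y, cache, params)
            params = update_parameters(params, grads, learning_rate)
            if i % 100 == 0:
                print('iteration', i, 'cost', cost)
        return params

    For example, model(X, Y, [X.shape[0], 20, 7, 1]) would train a 3-layer network, assuming X has shape (features, m) and Y has shape (1, m).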

Comments • 42

  • @hamafel7559
    @hamafel7559 8 months ago +3

    Dude.... your videos are FABULOUS!!
    Keep going!! We need you!!

  • @Felloggs
    @Felloggs a year ago +2

    Bro, your content and you as well are AWESOME.
    Liked and subbed, keep it up!

  • @lecturesfromleeds614
    @lecturesfromleeds614 5 months ago

    Straightforward and to the point! Good video

  • @ikrameounadi7075
    @ikrameounadi7075 a year ago +1

    Hey, great playlists, I can now say I can really understand deep learning. Can you please make a video explaining the perceptron algorithm and its complexity, along with kernels?!

  • @lokendrakumar6212
    @lokendrakumar6212 4 months ago

    Bro, this is the simplest explanation among all that I have seen

  • @soumyadeepsarkar2119
    @soumyadeepsarkar2119 6 months ago

    Thanks a lot bro, your videos really helped me

  • @threetime-ne2dc
    @threetime-ne2dc 7 months ago

    Thank you, it's the best and clearest video I have watched 😍 You are an extremely handsome man!!!

  • @subhambasuroychowdhury9698

    Awesome Video man!!!

  • @matthiasblumrich6802
    @matthiasblumrich6802 5 months ago +1

    You and Josh Starmer (StatQuest) totally demystify DNNs. Thanks!!!

    • @CodingLane
      @CodingLane  5 months ago +1

      Hey.. that's a big compliment. Thanks!

  • @user-or2gv8zf4l
    @user-or2gv8zf4l a year ago

    Hi, Mr. Jay Patel!
    Thanks a lot for such a clear explanation!
    Why don't you use the derivative of the sigmoid function at the output layer (AL) during the backward pass?
    Can we state that the weights of the last layer (WL) learn without taking the backward pass of the output error (AL - Y) through the sigmoid into account?
    If yes, why don't you and other people use it?
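
For context on the question above: the sigmoid derivative is not skipped at the output layer; with a sigmoid output and binary cross-entropy cost, the chain rule dL/dA · dA/dZ algebraically simplifies to AL - Y, so that term is already folded in. A minimal NumPy check (variable names are illustrative, not taken from the video):

    import numpy as np

    np.random.seed(0)
    Z = np.random.randn(1, 5)            # pre-activations of the output layer
    Y = np.array([[1, 0, 1, 0, 1]])      # binary labels
    A = 1 / (1 + np.exp(-Z))             # sigmoid activation AL

    # chain rule written out explicitly: dL/dA times sigmoid'(Z) = A * (1 - A)
    dA = -(Y / A - (1 - Y) / (1 - A))    # derivative of cross-entropy w.r.t. A
    dZ_chain = dA * A * (1 - A)

    dZ_shortcut = A - Y                  # the simplified form usually written in code
    print(np.allclose(dZ_chain, dZ_shortcut))   # True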

  • @divyagarh
    @divyagarh 5 months ago +1

    Good stuff. No videos for a year? Please keep uploading. Thank you

  • @jatingupta3443
    @jatingupta3443 3 months ago

    Thanks bro

  • @svk0071
    @svk0071 2 months ago

    Very helpful. Please make a playlist for GAN and transformers like you made for CNN.

  • @confuseartist831
    @confuseartist831 a year ago +1

    you're back....damnnnnn💥💥

  • @ajay0909
    @ajay0909 a year ago +2

    Very decent explanation, would you like to do the same for CNNs?

  • @mustafatuncer4780
    @mustafatuncer4780 11 months ago +1

    I really liked it. I would also like for you to create a video about LSTMs and Transformers (from scratch).

    • @CodingLane
      @CodingLane  11 months ago +1

      Thanks for the suggestion :)

  • @shubhamraj310
    @shubhamraj310 a year ago +1

    keep it up✌️💯

  • @prajwalsatannavar4576
    @prajwalsatannavar4576 11 months ago +1

    Too Good man....Thank u so much!

  • @zonunmawiazadeng5018
    @zonunmawiazadeng5018 a year ago

    Hey, I implemented backpropagation on a CNN..... dL/dZ = dL/dA dot product dA/dZ, according to your video.... In my implementation, dL/dA and dA/dZ both have the shape (training size, image height, image width, channel size)... If this is correct, how should we dot product them....
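
On the question above: when dL/dA and the activation derivative dA/dZ have the same shape (training size, height, width, channels), the chain-rule step is an elementwise (Hadamard) product, not a matrix dot product. A small illustrative sketch (shapes and names are placeholders):

    import numpy as np

    m, h, w, c = 4, 8, 8, 3
    Z = np.random.randn(m, h, w, c)        # pre-activations of a conv layer
    dA = np.random.randn(m, h, w, c)       # dL/dA flowing back from the next layer

    relu_grad = (Z > 0).astype(float)      # dA/dZ for a ReLU activation, same shape as Z
    dZ = dA * relu_grad                    # elementwise product; shape stays (m, h, w, c)
    print(dZ.shape)                        # (4, 8, 8, 3)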

  • @rodrigoyepez8123
    @rodrigoyepez8123 a year ago

    I keep getting this error ( ValueError: shapes (1,1) and (10,1000) not aligned: 1 (dim 1) != 10 (dim 0) ) in reference to this line ( grads["dZ" + str(l)] = np.dot(parameters['W' + str(l+1)].T, grads["dZ" + str(l+1)]) * derivative_relu(forward_parameters['A' + str(l)]) ). Does anyone know why?
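
On the shape error above: it is hard to say which array is wrong without the full code, but printing the shapes used at each layer usually pinpoints a "shapes not aligned" error quickly. A hypothetical helper, with dictionary keys matching the line quoted in the comment:

    import numpy as np

    def report_backward_shapes(parameters, grads, forward_parameters, L):
        # np.dot(W.T, dZ) requires W.T.shape[1] == dZ.shape[0]
        for l in reversed(range(1, L)):
            W_next = parameters['W' + str(l + 1)]
            dZ_next = grads['dZ' + str(l + 1)]
            A_l = forward_parameters['A' + str(l)]
            print('layer', l, '| W' + str(l + 1) + '.T', W_next.T.shape,
                  '| dZ' + str(l + 1), dZ_next.shape,
                  '| A' + str(l), A_l.shape)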

  • @yulupi552
    @yulupi552 a year ago +1

    Great content, would you like to do the same for RNNs?

  • @ashraf_isb
    @ashraf_isb 10 months ago +1

    Thank you so much sir, very much helpful 🙂

  • @ongyinren3secv96
    @ongyinren3secv96 a year ago

    I had a problem with the output in Spyder: it only shows one iteration, which is
    iter:1 cost: 0.697567606727616 train_acc:0.65 test_acc:0.3
    ==
    May I know what the error could possibly be?

  • @uchindamiphiri1381
    @uchindamiphiri1381 a year ago

    Is this video suitable for beginners? If not, please recommend what to watch before jumping into this.

  • @tungvuson4682
    @tungvuson4682 11 months ago

    Hi, what is the name of your dataset?

  • @niladrichakraborti5443

    Please make a video on COVID-19 detection using chest X Ray

  • @chandut1296
    @chandut1296 a year ago +1

    Hi bro, try to do some videos on pre-trained Hugging Face transformers

    • @CodingLane
      @CodingLane  a year ago

      Thanks for your suggestion… will try to make a video on it

  • @algorithmo134
    @algorithmo134 a month ago

    You made a small mistake while typing the code for the derivative_tanh(x) function.
    ✔ The correct code would be:
    def derivative_tanh(x):
        return 1 - np.power(np.tanh(x), 2)

  • @pranavgangapurkar195
    @pranavgangapurkar195 a year ago +1

    you are SVNIT passout na?

  • @TheCodingChamelion
    @TheCodingChamelion 8 days ago +1

    Copied from Coursera

    • @CodingLane
      @CodingLane  6 days ago

      Haha.. yeah, I have learned from that only. It's a very good source, tbh.
      I have made this video on YouTube because they don't have any video of it.