Neural Networks from Scratch - P.4 Batches, Layers, and Objects

  • Added 1. 06. 2024
  • Neural Networks from Scratch book: nnfs.io
    NNFSiX Github: github.com/Sentdex/NNfSiX
    Playlist for this series: • Neural Networks from S...
    Neural Networks IN Scratch (the programming language): • Neural Networks in Scr...
    Python 3 basics: pythonprogramming.net/introdu...
    Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
    Mug link for fellow mug aficionados: amzn.to/2KFwsWn
    Channel membership: / @sentdex
    Discord: / discord
    Support the content: pythonprogramming.net/support...
    Twitter: / sentdex
    Instagram: / sentdex
    Facebook: / pythonprogramming.net
    Twitch: / sentdex
    #nnfs #python #neuralnetworks

Comments • 952

  • @Yguy
    @Yguy Před 4 lety +637

    I swear I am addicted to these more than Netflix.

  • @AaditDoshi
    @AaditDoshi Před 4 lety +260

    I don't even look at my calendar anymore. My week ends when sentdex drops a video.

  • @soroushe6394
    @soroushe6394 Před 4 lety +304

    I’m glad I’m living at a time that people like you share their knowledge in such quality for free.
    Thank you 🙏🏻

    • @slamsandwich19
      @slamsandwich19 Před rokem +1

      I was going to say the same thing

    • @MeinGoogleAccount
      @MeinGoogleAccount Před 7 měsíci +1

      yes, absolutely. i come from a time when programming meant you bought a book that was actually outdated the moment you bought it.
      thank you 🙂

  • @sayanguha5570
    @sayanguha5570 Před 4 lety +242

    Every time I see a neural network tutorial, it starts with "import tensorflow as tf" without giving a shit about the basics... but this is a very detailed video that clears up the basics, truly from scratch. THANK YOU FOR THE GOOD WORK

    • @lucygaming9726
      @lucygaming9726 Před 4 lety +4

      I agree with you, although you can check out Deeplearning.ai on Coursera. It's pretty good.

    • @aleksszukovskis2074
      @aleksszukovskis2074 Před 3 lety +7

      @@lucygaming9726 No thanks. I'm too poor for that.

    • @janzugic6798
      @janzugic6798 Před 3 lety +1

      @@aleksszukovskis2074 it's free and it's by Andrew Ng, the legend

    • @aleksszukovskis2074
      @aleksszukovskis2074 Před 3 lety

      @@janzugic6798 thanks

    • @supernova6553
      @supernova6553 Před 2 lety +2

      @@janzugic6798 you need a coursera subscription ($49/mo) after 7 day trial period regardless of the course being free

  • @sentdex
    @sentdex  Před 4 lety +109

    Errata:
    16:17: the animation was initially incorrect when I recorded. We fixed the animation but not the audio, so I ended up reading out the incorrect first row of values. We're adding row vectors here, so the animation is correct, the words are not (see the shapes sketch at the end of this thread). =]

    • @usejasiri
      @usejasiri Před 4 lety +1

      Please clarify the concept of the Gaussian Distribution that you introduced when talking about np.randn

    • @anjali7778
      @anjali7778 Před 4 lety

      if i draw a neural network with 12 inputs mapping into 3 outputs and
      connect each input to each output, there will be 36 lines in total,
      which means there have to be about 36 weights, but the weights you used had
      only 12 weights in the array - how is that possible?

    • @mayaankashok2604
      @mayaankashok2604 Před 3 lety +1

      @@anjali7778 He has only 4 inputs to the output layer... therefore the number of weights = 4*3 = 12.
      If instead you have 12 inputs, you will get 12*3 = 36 weights.

    • @fincrazydragon
      @fincrazydragon Před 11 měsíci

      Am I wrong, or is there something missing around the 9:08 point?

    • @dragonborn7152
      @dragonborn7152 Před 10 měsíci

      Question: why did we need to transpose weights2, since both are 3x3 matrices? Index 1 of the first would already equal index 0 of the second, right?
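
    A minimal NumPy sketch of the shapes discussed in this thread. The layer sizes follow the video (a batch of 3 samples, 4 inputs, 3 neurons); treat the numbers themselves as illustrative:

      import numpy as np

      # Batch of 3 samples with 4 inputs each: shape (3, 4).
      inputs = np.array([[1.0, 2.0, 3.0, 2.5],
                         [2.0, 5.0, -1.0, 2.0],
                         [-1.5, 2.7, 3.3, -0.8]])

      # 3 neurons, each holding 4 weights: shape (3, 4), i.e. 3 * 4 = 12 weights in total.
      weights = np.array([[0.2, 0.8, -0.5, 1.0],
                          [0.5, -0.91, 0.26, -0.5],
                          [-0.26, -0.27, 0.17, 0.87]])

      biases = np.array([2.0, 3.0, 0.5])   # one bias per neuron: shape (3,)

      # (3, 4) dot (4, 3) -> (3, 3): one row per sample, one column per neuron.
      # The transpose lines up each sample's 4 inputs with each neuron's 4 weights.
      layer_outputs = np.dot(inputs, weights.T) + biases
      print(layer_outputs)

      # The bias vector is broadcast as a row: biases[0] is added to column 0 of
      # every row, biases[1] to column 1, and so on -- matching the fixed animation.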

  • @Blendersky2
    @Blendersky2 Před rokem +17

    Just imagine if we had tutorials like these on all the AI and machine learning topics, and also on probability and statistics... man, every few minutes in the video I scroll the video list up and down hoping there will be 700 more videos like these, but it shows only 7. Amazing work, I will order your book now. Appreciate your dedication and hard work

  • @amogh3275
    @amogh3275 Před 4 lety +26

    Bruh this visualisation... Its unreal🔥

    • @Saletroo
      @Saletroo Před 4 lety +3

      ASMR for eyes, thanks Daniel!

    • @amogh3275
      @amogh3275 Před 4 lety

      @@Saletroo ikr 😂😂

  • @knowit3887
    @knowit3887 Před 4 lety +36

    U r just ... God for teaching programming... I am glad to have u as a teacher... 💪

  • @aoof6742
    @aoof6742 Před 4 lety +79

    I really appreciate you doing this mate, I really wanted to learn Neural Networks and you are explaining this soo good.

  • @prathamkirank
    @prathamkirank Před 4 lety +16

    This is the online classes we all deserve

  • @kaustubhkulkarni
    @kaustubhkulkarni Před 4 lety +6

    I’d kind of given up on understanding ML and NN. Then I saw Neural Networks from scratch and Sentdex CANNOT make this easier. Loving this series.

    • @7Trident3
      @7Trident3 Před 4 lety +1

      I banged my head on numerous videos too. They assume a level of knowledge that was hard to piece together. This series is filling lots of gaps for me. The concepts are starting to jell, this whole field is fascinating!! Kind of empowering.

  • @ramyosama8088
    @ramyosama8088 Před 4 lety +14

    Please continue with this playlist This is hands down the best series on youtube right now !!!

    • @sentdex
      @sentdex  Před 4 lety +6

      No plans to stop any time soon!

  • @bas_kar_na_yar
    @bas_kar_na_yar Před 4 lety +19

    I wish anyone had ever taught me any concept the way you do..

  • @rrshier
    @rrshier Před 3 lety +3

    At about 14:51, where you present the matrix multiplied by the vector, the proper mathematical notation would be to write the vector as a column vector, with the output vector being a column vector as well. That is really how the matrix multiplication works, because a vector is just a matrix where one of the dimensions is equal to 1 (see the short note after this thread). Other than that, I have to admit, these are my FAVORITE AI/ML videos yet!!!

    • @pensivist
      @pensivist Před 9 měsíci

      I was looking for this comment. Thanks for pointing that out!
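
    A small NumPy aside on the note above (W and x are hypothetical names, values illustrative): a 1-D array has no explicit row/column orientation, and np.dot treats it exactly like the column vector described here.

      import numpy as np

      W = np.array([[0.2, 0.8, -0.5, 1.0],
                    [0.5, -0.91, 0.26, -0.5],
                    [-0.26, -0.27, 0.17, 0.87]])   # shape (3, 4)
      x = np.array([1.0, 2.0, 3.0, 2.5])           # shape (4,): a plain 1-D array

      print(np.dot(W, x))                  # shape (3,): x behaves like a column vector
      print(np.dot(W, x.reshape(4, 1)))    # explicit (4, 1) column -> (3, 1) column output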

  • @ginowadakekalam
    @ginowadakekalam Před 4 lety +10

    This channel is so good that you'll never find any negative comments

    • @sentdex
      @sentdex  Před 4 lety +5

      They are there sometimes :) but yes fairly rare.

    • @usejasiri
      @usejasiri Před 4 lety

      -comment

  • @jonathantribble7013
    @jonathantribble7013 Před 4 lety +108

    Friend: "So what do you do in your free, unwind, leisure time?"
    Me: "Neural Networks From Scratch"
    Friend: "..."

    • @alexgulewich9670
      @alexgulewich9670 Před 3 lety +5

      Sister: "If that's informative, then what's educational"
      Me: "Glad you asked!" *starts to explain neural networks and basic QP*
      Sister: "NO! Make it stop!" *Never asks again*

    • @stevenrogersfineart4224
      @stevenrogersfineart4224 Před 3 lety

      Story of my life 😁

  • @chaosmaker781
    @chaosmaker781 Před 2 lety +3

    this is better explained and with more quality than any neural network video where the concept is mostly shown just by the code

  • @Gorlung
    @Gorlung Před 3 lety +7

    this is actually the first NN tutorial during which I haven't fallen asleep..
    ps. thank you for explaining some of the things twice!

  • @asdfasdfasdf383
    @asdfasdfasdf383 Před 4 lety +1

    You have created one of the best series on this topic I have found on the internet. Explanations include everything, yet you still proceed at a fast steady pace.

  • @aav56
    @aav56 Před 2 lety +12

    I've never learned linear algebra and I'm astounded how simple you made matrix multiplication out to be!

  • @usamanadeem7974
    @usamanadeem7974 Před 4 lety +15

    The thing I love about you is just how beautifully you explain concepts, with immaculate animations and then literally make such complex tasks seem so easy! Gonna make my kids watch your tutorials instead of cartoons one day ♥️😂

  • @prathamprasoon2535
    @prathamprasoon2535 Před 4 lety +19

    This is awesome! Finally, a series on neural nets I can understand easily.

  • @chaks2432
    @chaks2432 Před 3 lety +1

    This is my first time learning about Neural Networks, and you're doing a great job at explaining things in an easy to understand way.

  • @ambarishkapil8004
    @ambarishkapil8004 Před 4 lety +11

    I know I have said this before, but I am going to say it again, and I will keep on saying it as long as you continue to make such awesome tutorials. Thank you!

  • @kenbinner
    @kenbinner Před 4 lety +3

    I'm really glad you took the time to break down this concept step by step, will surely reduce the number of headaches in the future!
    Thank you for your great content looking forward to the next one. 😄

  • @harikalatheeswaran9206
    @harikalatheeswaran9206 Před 4 lety +4

    For people watching this video... remember this golden rule:
    Say we have two matrices A and B. In order to multiply A with B, i.e. A.B,
    the number of columns of matrix A must equal the number of rows of matrix B.
    That's also why A.B != B.A in general (see the quick check after this comment).
    Amazing video 👍!
    Thanks a lot!
    Keep up the amazing work!
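
    A quick NumPy check of the rule above; the shapes are arbitrary and only meant to illustrate it:

      import numpy as np

      A = np.random.rand(2, 3)   # 2 rows, 3 columns
      B = np.random.rand(3, 4)   # 3 rows, 4 columns

      print(np.dot(A, B).shape)  # (2, 4): the 3 columns of A match the 3 rows of B
      # np.dot(B, A) raises a ValueError: B's 4 columns don't match A's 2 rows.

      # Even for square matrices, where both orders are defined, the products differ:
      C = np.random.rand(3, 3)
      D = np.random.rand(3, 3)
      print(np.allclose(np.dot(C, D), np.dot(D, C)))   # almost always False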

  • @carlossegura403
    @carlossegura403 Před 3 lety +2

    Back when I was learning the concepts behind building a network, most tutorials went straight into the maths, while that is fine - what I wanted to understand was the different compositions from the input to the output. This video was what I was looking for back then before going deep into the theory and methodology. Great content!

  • @littlethings-io
    @littlethings-io Před 3 lety +1

    Just ordered the book - can't wait to dive into it. Thank you, this is good stuff and a priceless contribution to the evolution of this area of science.

  • @codiersklave
    @codiersklave Před rokem +3

    Still one of the best series on YouTube to learn the basics of neural networks... fast!

  • @keshavtulsyan7515
    @keshavtulsyan7515 Před 4 lety +3

    Feels like learning all the day, it never felt so simple before...thanks a lot 🙏🏻

  • @jsnadrian
    @jsnadrian Před 4 lety

    i can't believe you created this course - absolutely fantastic and wonderfully thoughtful in its layout - thanks so much

  • @Alex-ol9dk
    @Alex-ol9dk Před rokem +1

    I have never bought a book from YouTube before, but yours will be the first. You've earned it. Absolutely love this work. Please keep it up

  • @Alfosan2010
    @Alfosan2010 Před 4 lety +89

    last time I was this early, Corona was just a beer brand...

  • @shubhamdamani1057
    @shubhamdamani1057 Před 4 lety +9

    Can you please provide a visual representation of how the batches pass along? I mean an animation using bubbles and lines like you did in the initial videos.

  • @Voyagedudimanche
    @Voyagedudimanche Před 4 lety +1

    Hello! I have been following you for more than 2 years and this is the best course for me! With these explanations of the math it is really cool. Thank you for this work :)

  • @devinvenable4587
    @devinvenable4587 Před 9 měsíci

    I'm watching this as a refresher, as I studied this topic a few years ago, and I find the context you provide really useful. Thanks!

  • @yabdelm
    @yabdelm Před 4 lety +22

    This is the best series by far I've ever seen. Just what I was looking for. I wonder if you'll get into explaining the why also.
    For instance, often times when I'm watching I do wonder "Why do we even have biases? What function do they serve? How do they enhance predictions? What sort of history/science/neuroscience underlies that and where do AI and neuroscience partways if so? Why does all of this work at all?"

    • @asongoneal28
      @asongoneal28 Před 4 lety +3

      Youssef I really hope @sentdex reads this ;)

    • @naseemsha3010
      @naseemsha3010 Před 4 lety +4

      I think it was explained in a previous video, how biases help making predictions. Check out the last video guys

    • @carloslopez7204
      @carloslopez7204 Před 4 lety +1

      He explained that in previous videos, but not all your questions

    • @yabdelm
      @yabdelm Před 4 lety +1

      @@carloslopez7204 I agree it was explained a bit but I really didn't feel the explanation gave me a deep understanding of the why unfortunately, just a very rough surface level and vague hint of what might be going on.

    • @liadinon1134
      @liadinon1134 Před 4 lety

      I think that right now some things, like the biases, don't make sense. But when you get into training (the learning process), it all starts to make sense.

  • @classicneupane6196
    @classicneupane6196 Před 4 lety +13

    Understood batch size finally

    • @sentdex
      @sentdex  Před 4 lety +1

      Glad we could help!

  • @hasneetsingh
    @hasneetsingh Před 3 lety

    Your explanations are so clear, I really appreciate the hard work you've put into designing this series to make such complex topics so much fun to learn :) Enjoying it a lot

  • @peppep4426
    @peppep4426 Před 4 lety +1

    This reminds me of the best TV series ... You finish one episode and look forward to the next ...
    Good job!

  • @JackSanRio
    @JackSanRio Před 4 lety +3

    I pre-ordered the book because this is interesting and I am eager to learn more

  • @lemoi6462
    @lemoi6462 Před 4 lety +6

    The interesting part will be the backward propagation, im really looking forward to this

  • @aamirkhanmaarofi9705
    @aamirkhanmaarofi9705 Před 3 lety

    Watching this playlist is awesome, it made my task very easy. Have been stuck with the implementation of the multilayer perceptron for two days. Thanks

  • @lonnie776
    @lonnie776 Před 4 lety

    You are doing a great job explaining these concepts in a way that is easy to understand. I can't wait for the next part so I am ordering the ebook.
    Great job.

  • @patrickvieira9200
    @patrickvieira9200 Před 4 lety +4

    well finally looks like my linear algebra class was not a waste of time at all

  • @thenotsogentlecat5847
    @thenotsogentlecat5847 Před 3 lety +9

    Sentdex: we're arriving at the sexy parts...
    Python: Oh, yes I am ;)

    • @kelpdock8913
      @kelpdock8913 Před 3 lety

      x = we're arriving at the sexy parts...
      print(x)

  • @bradley1995
    @bradley1995 Před 10 měsíci +1

    I just want to again say thank you so much for these videos. They are top notch. It truly has helped me get a deep understanding compared to what many other "tutorials" have. Plus all this information being provided free. I feel blessed!

  • @jedisenpei855
    @jedisenpei855 Před 3 lety

    Apart from explaining neural networks, you just explained the matrix dot product in the most intuitive way I have ever seen. I know how the dot product works by now, but I also remember how much work I had to put in to understand the concept from the lectures and texts I had at university. I had to read through some difficult math equations and really think about what the book was trying to tell me, and I also had to go through a lot of exercises to really get a grasp of it and remember it. And then you just explained it in 10 minutes and it makes perfect sense, even though I had almost forgotten what it was all about. So easy. I wish my teacher had an animation like the one you show at 9:10. Then I wouldn't have had to struggle through the math classes as much as I did in my education as an electrical engineer.

  • @dippy9119
    @dippy9119 Před 3 lety +3

    6:09 what's a fitment line? Google isn't helping me.

  • @DRIP-DRIP-DRIP
    @DRIP-DRIP-DRIP Před 4 lety +9

    Never clicked on a video so quickly

  • @asu4908
    @asu4908 Před 2 lety

    Doing gods work, ordered the book a while ago and finally have time to actually dive into this now-thank you so much bro

  • @tymion2470
    @tymion2470 Před 16 dny

    I'm very thankful for this series, I'm learning so many new things because you're so good at explaining, and there are still 5 more videos to watch!

  • @time2learn123
    @time2learn123 Před rokem +3

    Why does the dot product switch inputs and weights when working with batches? E.g. when the input is a 1D array, the calculation in the code is np.dot(weights, inputs), but for a batch it is np.dot(inputs, transposed_weights). Why doesn't it work when we transpose the inputs instead? I'm sure I'm missing something simple. Thanks for the videos, they are amazing! (See the sketch after this thread.)

    • @joelgerlach9406
      @joelgerlach9406 Před rokem +2

      Because matrix multiplication is not commutative

    • @MrGeordiejon
      @MrGeordiejon Před rokem

      I think it is in the nature of what we are doing - we are taking inputs, applying weights and biases, and delivering outputs, i.e. entering and exiting a decision - so we can't use an entrance to a neuron to exit another neuron.
      I think the demonstrations by Harrison are there to cement the concept and an awareness of the ValueError about shapes... and he also showed how multiplication works between an array and a vector.
      I went back to lesson 3 for 2 things:
      1. I like inputs being the first entry so 'my doors' are labelled correctly
      2. To use the npArray().T in that example
      If he had not shown us 3 before 4, I would have found it harder to appreciate transpose() - I don't think I will ever just reverse the args when I am coding this stuff.
      import this

    • @413blaze
      @413blaze Před rokem

      Something that I find interesting, and that I think might have to do with this, is that specifically in the case of 1-dimensional arrays the shape is different. I am used to thinking of a matrix as rows by columns. For example, [[2,2],[3,3]] would be a 2 by 2 matrix: 2 rows and 2 columns. However, take the example [1,2,3,4]: I would have expected its shape to be 1 by 4 (1 row and 4 columns), but it is not. The shape of [1,2,3,4] is (4,), a 1-D shape with no second dimension at all. So the way to think about the shapes he shows is by elements in a list of lists: the first entry x in shape (x, y) is how many lists are in the list, and the second entry y is how many entries are within each of those lists. In his example, his first inputs [1,2,3,4] have shape (4,), and when he used [[1,2,3,4],[1,2,3,4],[1,2,3,4]] the shape became (3,4). I hope that made some modicum of sense lol

    • @ollie6989
      @ollie6989 Před rokem

      You could perform the same operation by transposing the inputs, but keep in mind the matrix rule (A.B)' == B'.A', e.g. inputs.(weights.T) == (weights.(inputs.T)).T. In other words, the output of inputs · weights_transposed equals the transposed output of weights · inputs_transposed. The issue with the values probably comes from adding the biases without first either transposing them or transposing that output matrix back, as they would be added along a completely different axis.
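
    A small NumPy check of the identity mentioned in the reply above; the shapes and the names inputs, weights and biases are illustrative, following the convention used in the series:

      import numpy as np

      inputs = np.random.rand(3, 4)    # batch of 3 samples, 4 features each
      weights = np.random.rand(5, 4)   # 5 neurons, 4 weights each
      biases = np.random.rand(5)       # one bias per neuron

      a = np.dot(inputs, weights.T)    # shape (3, 5): samples in rows, neurons in columns
      b = np.dot(weights, inputs.T).T  # same numbers, via (A.B)' == B'.A'
      print(np.allclose(a, b))         # True

      out = a + biases                 # (3, 5) + (5,): each row gets the bias vector

      # np.dot(weights, inputs.T) alone is (5, 3) -- neurons in rows, samples in
      # columns -- so the (5,) biases no longer line up with the last axis; you
      # would have to reshape them to (5, 1) and transpose the result back.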

  • @amogh3275
    @amogh3275 Před 4 lety +13

    16:19 - you said it the other way around by mistake. Shouldn't it be 2.8+2, 6.8+2, -0.59+2? (See the sketch after this thread.)

    • @fl7977
      @fl7977 Před 4 lety

      Yeah, that really confused me more than it should have

    • @bipanbhatta2736
      @bipanbhatta2736 Před 4 lety

      yes. It is called broadcasting.
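
    For anyone curious, a tiny sketch of that broadcasting (values illustrative): the single bias row is added to every row of the dot-product matrix, exactly as an explicit per-row loop would do.

      import numpy as np

      dot_products = np.array([[2.8, -1.79, 1.885],
                               [6.9, -4.81, -0.3]])   # two rows of pre-bias layer outputs
      biases = np.array([2.0, 3.0, 0.5])              # one bias per neuron

      broadcast = dot_products + biases                            # NumPy broadcasts across rows
      explicit = np.array([row + biases for row in dot_products])  # the same thing, row by row
      print(np.array_equal(broadcast, explicit))                   # True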

  • @DMBalchemy
    @DMBalchemy Před 4 lety

    Incredible as always. This one struck a few lightbulbs. Thanks again, Eagerly anticipating #5, I'll have to work through the draft to prep

  • @merth17
    @merth17 Před 4 lety +1

    I can’t wait to see the implementation of backpropagation with the chain rule, it’s so simple when you teach it. Tysm

  • @franky0226
    @franky0226 Před 4 lety +5

    Notification => nnfs P4.
    Me: clicks on the button faster than the speed of light

  • @ahmedyamany5065
    @ahmedyamany5065 Před 2 lety +3

    Great explanation and animation, but at 14:47 the Python array [1,2,3,2.5] acts as a column vector in the multiplication, so when you write it on paper or in the animation it should be written vertically as a column, not as a row; written as [1 2 3 2.5] the animation reads as a (1,4) matrix, not (4,1). In other words, every element of [1,2,3,2.5] is its own row: 1 is the 1st row, 2.5 is the 4th row.

  • @TNTeon
    @TNTeon Před 7 měsíci

    Hey just to let you know, this video 3 years later continues to help and encourage new programmers!
    I'm in my freshman year of high school taking all gen ed courses, but I started working on this tutorial in my free time and I'm having a blast and actually understanding everything perfectly.
    Just wanted to say thank you so much for really helping people like me in our learning of computer science and machine learning! These are awesome and super enjoyable!

  • @realbingus
    @realbingus Před 4 lety

    At the point where I had a question, I had not fully watched the video yet, so I commented my question. Literally five seconds later you answered it in the video.
    I love the series, thanks for doing this!

  •  Před 4 lety +20

    0th! Finally!

  • @afafssaf925
    @afafssaf925 Před 4 lety +4

    You are wayyyyy more buff than it seems by just your face.

    • @sentdex
      @sentdex  Před 4 lety +1

      I'll keep that in mind

  • @garymdmd
    @garymdmd Před rokem

    I am on lesson 4 now - you are such a great instructor, I love learning this stuff.

  • @yuwankumar
    @yuwankumar Před rokem +1

    After many searches I found this playlist! Thank you for making this Gold.

  • @thomasnevolianis8616
    @thomasnevolianis8616 Před 4 lety +4

    import neural_networks_from_scratch as nnfs
    from nnfs import moments
    best_moments = moments(channel='Sentdex')
    print(best_moments[0])
    ''The SEXY part of deep learning''

  • @youngfinn1185
    @youngfinn1185 Před 5 měsíci +1

    i found this gold 3 years later. i noticed at 16:11, when explaining adding the matrix to the bias vector, you're saying the wrong thing but showing the correct one.

  • @gurns681
    @gurns681 Před 2 lety

    Mate, this series is unreal! Love your work

  • @benjaminsteakley
    @benjaminsteakley Před rokem

    Took me five years to find something like your videos, in 2022. I dropped out of college from stress and I can finally sit down and try to understand this math. I hope the video which explains linear regression is as good as these four so far

  • @unionid3867
    @unionid3867 Před 2 lety

    Honestly, I had almost given up looking for a tutorial on building a neural network for beginners; luckily I found your video, thank you very much

  • @RentnersWerkstatt
    @RentnersWerkstatt Před 19 dny

    Dude!!! I am on video 4 so far, and thus far you are about 99 out of 100 fantastic. Here is what is missing from the perspective of someone who doesn't know what a NN is or how to build one.
    Your video series starts out with an expectation of programming in some language, and that is totally fair, else no one would understand anything at all. I do feel that one needs a basic understanding of Python, mostly because of terminology. Knowing that a list is an array and a batch is simply an array of arrays is useful, especially if the student's background is in PHP or JavaScript. Now here is where I start exploring outside of your videos...
    Your start with the raw code of this NN is also fantastic. What I recommend is a bit more basic explanation. For example, what is a layer? Why is a layer? You can mention the function or purpose of weights and also mention that they are, or can be, random, as the "learning" part will be the machine testing different weights and biases itself. You also compare the code to a NN diagram, and that is MOST AWESOME, thank you.
    You also start out building this code around a scenario of mainboard/case/rack sensors used to determine the failure of a server, so you basically should have a final result of yes/no it will fail, the same as your cat/dog example. But one may assume we are inputting only pics of cats or dogs, thus your output can be a binary isCat and that is all that is required. I would like to see more in-line analogies to your sensor scenario, as it is a massive benefit to understanding each element of the NN. That is my only advice; thus far I love it.
    When these explanations are missing, it is easy to go to another resource and ask what a "layer" is and why, or what it does, but from a product perspective your intent for "your video series" should be to keep me from venturing out; as with a bounce rate, you do not want to lose my attention.

  • @NikhilSandella
    @NikhilSandella Před 4 lety

    This is the best channel with the best content, with amazing animation. Clear explanation. I'm in love with this man. :)

  • @anilsarode6164
    @anilsarode6164 Před 3 lety +2

    I think the single array of biases at 16:16 getting added to each individual row of the dot-product matrix is due to NumPy broadcasting. Thanks a lot for this video series.

  • @3alabo
    @3alabo Před 4 lety +1

    One of the best tutorials I have seen on the topic. Greetings from Argentina!

  • @milindprajapat7177
    @milindprajapat7177 Před 4 lety

    Best feeling ever...after seeing sentdex p.4 out...Thank you so much...May god keep you safe from corona virus so that we can get awesome content from you! Love from India ♥️

  • @saisiddhanthgujjari8954

    Amazing content sentdex, the visualizations are just top notch and aid to a much clearer explanation.

  • @FagunRaithatha
    @FagunRaithatha Před rokem

    This content is really good. Thanks for making this simple. I have been binge-watching your videos.

  • @sciWithSaj
    @sciWithSaj Před 3 lety +1

    Thanks a lot
    This will be my first object oriented programming.
    It was kind of daunting for me, but you made it so simple.

  • @super7ace
    @super7ace Před rokem +1

    God level series on Neural Network. Good job and always proud of you buddy!!

  • @dinarakhaydarova4898
    @dinarakhaydarova4898 Před rokem

    I thought I understood all of these concepts until I watched your tutorials. it's amazing!

  • @fanasisangweni8539
    @fanasisangweni8539 Před 4 lety +1

    I'm glad I found your channel man, I swear to god your videos are awesome. I'm only now, after watching your videos, starting to understand ANNs.

  • @clementsiow176
    @clementsiow176 Před 4 lety

    Never have I been so excited for a new YouTube video; you have earned my respect

  • @rakshitjoshi823
    @rakshitjoshi823 Před měsícem

    High quality animations. Much respect!

  • @horseman3253
    @horseman3253 Před 3 lety

    Wooow, this is how all subjects in school should be explained - amazing visualization, very clear!

  • @julianray6802
    @julianray6802 Před 3 lety +1

    Thought I knew what batch size was about.... I was completely wrong....I need to think like a neuron! Thanks again Sentdex for yet another great video!

  • @josephyu2110
    @josephyu2110 Před 6 měsíci

    Wow, your videos are just amazing; this clarity in explaining complex things is just incredible

  • @brendensong8000
    @brendensong8000 Před 3 lety

    Thank you for the clear explanation! I was completely lost after several videos! you made it so clear!

  • @santoshtamboli
    @santoshtamboli Před 4 lety +1

    This is really awesome. There is no chance for anybody to post negative comments. I would like to watch more videos from you on ML and DL

  • @Gazarodd
    @Gazarodd Před 4 lety

    I think this tutorial series will explode. Atm it's really clear; you're fantastic

  • @danielbardsen4101
    @danielbardsen4101 Před 4 lety

    Hi Sentdex,
    Thanks for making my engineering/programming career so much more interesting! You really are the best

  • @ChiKettle
    @ChiKettle Před 2 lety

    3 points:
    1. AWESOME videos
    2. Amazing coffee mugs; genuinely thought he was eating a burger in the previous video for a second
    3. He sounds like Nicholas Cage e.g. 5:54

  • @3kelvinhong
    @3kelvinhong Před 3 lety

    This video really helped me a lot! After I learned feedforward and backpropagation, I managed to construct a three-layer neural network with the mechanisms inside, but it was hard to change the code to four or five layers. With your technique, however, I could change my code into something more dynamic. Thanks!

  • @sharanbabu2001
    @sharanbabu2001 Před 4 lety +1

    Loving the effectiveness! The batch size explanation was amazing!

    • @sentdex
      @sentdex  Před 4 lety +1

      Glad you liked it!!

  • @Mayank25
    @Mayank25 Před 3 lety

    This is the best tutorial ever I watched.. Kudos 👍🙌🙌🙌

  • @ryangao3564
    @ryangao3564 Před 4 lety

    Hey sentdex, such addictive content in your videos. Couldn't wait for the next release any longer, so I just pre-ordered the e-book.

  • @johnyeap7133
    @johnyeap7133 Před rokem

    Made the batch learning benefits really clear, thank you

  • @ayanpanja1956
    @ayanpanja1956 Před 3 lety

    You just made me cry. The way you present this, my god!!! It's really awesome. So much effort. I still don't understand the people who disliked the video.

  • @raccoon_05
    @raccoon_05 Před rokem

    Thx so much for this series. You're really helping me understand the basic concepts behind this 👍👍👍

  • @chuckf5540
    @chuckf5540 Před 3 lety

    Great explanation and very clear. I look forward to all videos. What a learning process!!

  • @ConnellArt
    @ConnellArt Před 9 měsíci

    Can't wait for the book to arrive! This is a great tutorial so far. Thank you Edward Snowden!

  • @strawhat3891
    @strawhat3891 Před 3 lety

    Totally worth it! I wish I had the talent to make things simple like you.