How Forward Propagation in Neural Networks works

  • Published 24. 07. 2024
  • Welcome to Deep Learning 101. In this course we will start from the ground and build a solid foundation for advanced deep learning practices. In this first lesson we will take an overall look at deep learning to understand what it means. We will learn the first terminology and the main building blocks of deep learning. This way we can start the following week's lessons confidently.
    Previous lesson: • A Summary of Deep Lear...
    Next lesson: • Backward Propagation i...
    📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumroad.com/l/fdl
    👩‍💻 You can get access to all the code I develop in this course here: github.com/misraturp/Deep-lea...
    ❓ To get the most out of the course, don't forget to answer the end-of-module questions:
    fishy-dessert-4fc.notion.site...
    👉 You can find the answers here:
    fishy-dessert-4fc.notion.site...
    RESOURCES:
    🏃‍♀️ Data Science Kick-starter mini-course: www.misraturp.com/courses/dat...
    🐼 Pandas cheat sheet: misraturp.gumroad.com/l/pandascs
    📥 Streamlit template (updated in 2023, now for $5): misraturp.gumroad.com/l/stemp
    📝 NNs hyperparameters cheat sheet: www.misraturp.com/nn-hyperpar...
    📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumroad.com/l/fdl
    COURSES:
    👩‍💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp.com/hods
    🌎 Website - misraturp.com/
    🐥 Twitter - / misraturp
  • Science & Technology

Comments • 31

  • @mehmetakyuz1632
    @mehmetakyuz1632 1 month ago

    This is awesome. Right level of detail balanced with simplification. Great work!

  • @jameelabduljalil25
    @jameelabduljalil25 1 year ago +2

    Thanks for the time and effort you put into these lessons and for making them free. Really appreciate it. 💐

  • @HomeDesign_Austin
    @HomeDesign_Austin 4 months ago +1

    Great detail for new learners, great job.

  • @NM-vw6xq
    @NM-vw6xq 1 year ago +1

    Misra, thank you so much for putting together these videos. They helped me out a lot. Seeing you walk through a simple example made a lot of things click for me vs. just seeing random notation in a textbook that I never fully understood.

  • @prasad4685
    @prasad4685 4 days ago

    Thanks ma'am- very useful

  • @hshrestha2811
    @hshrestha2811 1 year ago

    You have presented things in a very simple and comprehensive manner. Thanks!

  • @zainulhassan8557
    @zainulhassan8557 11 months ago

    Thank you so much, ma'am. Really appreciate your efforts. You explain everything in an easy way.

  • @alexanderkamke3774
    @alexanderkamke3774 1 year ago

    Thanks a lot! It's very easy to understand...

  • @behradio
    @behradio 1 year ago

    Simple and Understandable, Thank you 🙏

  • @dancode0775
    @dancode0775 1 year ago

    Clear and precise, thank you.

  • @himanshupandey9902
    @himanshupandey9902 1 year ago

    Great... keep it up.

  • @schorrer17
    @schorrer17 8 months ago

    Hey, thanks for the great video!
    What would the formula be if there were a second hidden layer?
    Assuming B is the output of the second layer, W2^T is the weight matrix of the second layer, b2 its bias, and so on:
    B = beta{W2^T * [alpha(W1^T * X + b1)] + b2}
    Is this correct?
    Thanks for any help.
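The composition in the comment above can be checked with a small NumPy sketch. The shapes, helper name, and the use of ReLU for both activations alpha and beta are assumptions for illustration, not taken from the course code:

```python
import numpy as np

def forward_two_hidden(X, W1, b1, W2, b2, alpha, beta):
    # Matches the formula in the comment above:
    # B = beta(W2^T @ alpha(W1^T @ X + b1) + b2)
    A = alpha(W1.T @ X + b1)    # first hidden layer output
    return beta(W2.T @ A + b2)  # second layer output B

# Toy shapes (assumed): 2 input features, then 3 and 4 hidden units.
rng = np.random.default_rng(0)
X  = rng.standard_normal((2, 1))
W1 = rng.standard_normal((2, 3)); b1 = np.zeros((3, 1))
W2 = rng.standard_normal((3, 4)); b2 = np.zeros((4, 1))

relu = lambda z: np.maximum(0, z)
B = forward_two_hidden(X, W1, b1, W2, b2, relu, relu)
print(B.shape)  # (4, 1)
```

The shapes compose as expected: W1.T @ X is (3, 1), and W2.T applied to that gives (4, 1), one value per second-layer unit.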

  • @joniendog6693
    @joniendog6693 1 year ago

    For a beginner like me, learning this does feel a bit difficult. To understand it faster I keep rewatching the video. Thank you, and stay healthy.

  • @viktorkovacs9680
    @viktorkovacs9680 1 year ago +1

    Thanks, I understand better now. But are the slides correct at 9:11 and 12:13?

  • @hshrestha2811
    @hshrestha2811 1 year ago

    I think you need two columns in the W transpose matrix. They are currently presented as a single column, with w1w2 in product form.
    For matrix multiplication, the number of columns in the W transpose matrix should equal the number of rows in the X matrix (2).
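The shape constraint the comment describes can be verified in a quick NumPy sketch (the specific sizes, 2 input features and 3 hidden units, are assumed for illustration):

```python
import numpy as np

X = np.array([[0.5], [1.5]])  # shape (2, 1): one sample with 2 features
W = np.random.rand(2, 3)      # shape (2, 3): one column per hidden unit
b = np.zeros((3, 1))          # shape (3, 1): one bias per hidden unit

# W.T has shape (3, 2); its number of columns (2) equals the number of
# rows in X (2), so the product W.T @ X is defined and has shape (3, 1).
Z = W.T @ X + b
print(Z.shape)  # (3, 1)
```

If W.T collapsed to a single column, the product with a (2, 1) input would raise a shape-mismatch error, which is the issue the comment points out.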

  • @mohammedzia1015
    @mohammedzia1015 1 year ago

    Please share the slides and course notes for Lesson 2, Module 1.

    • @misraturp
      @misraturp 1 year ago +1

      Thanks for the heads up. They're there now.

  • @lakeguy65616
    @lakeguy65616 1 year ago

    ReLU: if the value is less than zero, ReLU returns 0; if greater than zero, it returns the value unchanged.
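That piecewise rule is a one-liner in NumPy; the helper below is an illustrative sketch, not the course's implementation:

```python
import numpy as np

def relu(z):
    # ReLU as described above: 0 for values below zero,
    # the value itself (unchanged) otherwise.
    return np.maximum(0, z)

out = relu(np.array([-2.0, 0.0, 3.5]))
```

Negative entries are clipped to 0 while non-negative entries pass through, so the example above yields 0, 0, and 3.5.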

  • @bkentffichter
    @bkentffichter 7 months ago

    This is all fine, but I have no idea what all of this means as far as learning. I need to have some sort of practical something.

  • @playcode001
    @playcode001 1 year ago

    So I first saw you on Assembly AI; loved your content, thank you so much.