How Forward Propagation in Neural Networks works
- added 24 Jul 2024
- Welcome to Deep Learning 101. In this course we will start from the ground up and build a solid foundation for advanced deep learning practice. In this first lesson we will take an overall look at deep learning to understand what it means. We will learn the basic terminology and the main building blocks of deep learning, so we can start the following weeks' lessons confidently.
Previous lesson: • A Summary of Deep Lear...
Next lesson: • Backward Propagation i...
📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumroad.com/l/fdl
👩💻 You can get access to all the code I develop in this course here: github.com/misraturp/Deep-lea...
❓To get the most out of the course, don't forget to answer the end of module questions:
fishy-dessert-4fc.notion.site...
👉 You can find the answers here:
fishy-dessert-4fc.notion.site...
RESOURCES:
🏃♀️ Data Science Kick-starter mini-course: www.misraturp.com/courses/dat...
🐼 Pandas cheat sheet: misraturp.gumroad.com/l/pandascs
📥 Streamlit template (updated in 2023, now for $5): misraturp.gumroad.com/l/stemp
📝 NNs hyperparameters cheat sheet: www.misraturp.com/nn-hyperpar...
📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumroad.com/l/fdl
COURSES:
👩💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp.com/hods
🌎 Website - misraturp.com/
🐥 Twitter - / misraturp
This is awesome. Right level of detail balanced with simplification. Great work!
Thanks for the time and effort you put into these lessons and for making them free. Really appreciate it! 💐
You're very welcome Jameel!
great detail for new learners, great job
Misra, thank you so much for putting together these videos. They helped me out a lot. Seeing you walk through a simple example made a lot of things click for me vs. just seeing random notation in a textbook that I never fully understood.
Glad it was helpful!
Thanks, ma'am. Very useful.
You have presented things in a very simple and comprehensive manner. Thanks!
You're very welcome :)
Thank you so much, ma'am. Really appreciate your efforts. You explain everything in an easy way.
Thanks a lot! It's very good for understanding.
Simple and Understandable, Thank you 🙏
You're welcome :)
clear and precise thank you
Glad it was helpful!
Great... keep it up.
:)
Hey thanks for the great video!
How would the formula look if there were a second hidden layer?
Assuming B is the output of the second layer, W2^T the weight matrix of the second layer, b2 its bias, and so on:
B = beta(W2^T * alpha(W1^T * X + b1) + b2)
Is this correct?
Thanks for any help!
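A quick way to sanity-check that composed formula is to run the shapes through NumPy. This is a minimal sketch, not the course's code: the layer sizes are made up, alpha is taken to be ReLU, and beta is assumed to be a sigmoid output activation.

```python
import numpy as np

def relu(z):
    # alpha: hidden-layer activation (ReLU assumed)
    return np.maximum(0, z)

def sigmoid(z):
    # beta: output activation (sigmoid assumed)
    return 1 / (1 + np.exp(-z))

# Hypothetical sizes: 3 input features, 4 units in layer 1, 2 units in layer 2
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 1))    # one input sample as a column vector
W1 = rng.normal(size=(3, 4))   # layer-1 weights
b1 = np.zeros((4, 1))          # layer-1 bias
W2 = rng.normal(size=(4, 2))   # layer-2 weights
b2 = np.zeros((2, 1))          # layer-2 bias

A1 = relu(W1.T @ X + b1)       # first hidden layer: alpha(W1^T X + b1)
B = sigmoid(W2.T @ A1 + b2)    # second layer:       beta(W2^T A1 + b2)
print(B.shape)                 # (2, 1)
```

If the shapes line up and B comes out as one value per output unit, the nesting in the formula is consistent: each layer applies its activation to its own weighted sum of the previous layer's output.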
For a beginner like me, learning this feels a bit difficult, so to understand it faster I keep rewatching the video. Thank you, and stay healthy!
Thanks, I understand better now. But are the slides correct at 9:11 and 12:13?
Yep
I think you need two columns in the W transpose matrix. They are currently presented as a single column, with w1w2 written as a product.
For matrix multiplication, the number of columns in the W transpose matrix should equal the number of rows in the X matrix (2).
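The shape rule mentioned here is easy to demonstrate. This is a generic sketch (the sizes are hypothetical, not taken from the slides): for W.T @ X to be defined, the column count of W.T must match the row count of X.

```python
import numpy as np

# Hypothetical sizes: 2 input features, 3 hidden units.
W = np.ones((2, 3))  # shape (n_inputs, n_units)
X = np.ones((2, 1))  # shape (n_inputs, 1): one sample with 2 features

# W.T is (3, 2); its 2 columns match X's 2 rows, so the product is defined.
Z = W.T @ X          # result shape: (3, 1), one value per hidden unit
print(Z.shape)       # (3, 1)
```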
Disagree
Please share the slides and course notes for Lesson 2, Module 1.
Thanks for the heads up. They're there now.
ReLU: if the value is less than zero, ReLU returns 0; if greater than zero, it returns the value unchanged.
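That definition translates directly into one line of NumPy; a minimal sketch:

```python
import numpy as np

def relu(x):
    # Returns 0 for negative inputs, the input unchanged otherwise
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [0. 0. 0. 3.]
```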
Well?
@bay-bicerdover Well what?
This is all fine, but I have no idea what any of this means for actual learning. I need some sort of practical example.
I first saw you on AssemblyAI and loved your content. Thank you so much!
Glad you enjoyed!