Week 1 - Lecture: History, motivation, and evolution of Deep Learning
- Added: 23 July 2024
- Course website: bit.ly/DLSP20-web
Playlist: bit.ly/pDL-YouTube
Speaker: Yann LeCun
Week 1: bit.ly/DLSP20-01
0:00:00 - Week 1 - Lecture
LECTURE Part A: bit.ly/DLSP20-01-1
We discuss the motivation behind deep learning. We begin with the history and inspiration of deep learning. Then we discuss the history of pattern recognition and introduce gradient descent and its computation by backpropagation. Finally, we discuss the hierarchical representation of the visual cortex.
0:03:37 - Inspiration of Deep Learning and Its History, Supervised Learning
0:24:21 - History of Pattern Recognition and Introduction to Gradient Descent
0:38:56 - Computing Gradients by Backpropagation, Hierarchical Representation of the Visual Cortex
LECTURE Part B: bit.ly/DLSP20-01-2
We first discuss the evolution of CNNs, from Fukushima to LeCun to AlexNet. We then discuss some applications of CNNs, such as image segmentation, autonomous vehicles, and medical image analysis. We discuss the hierarchical nature of deep networks and the attributes that make them advantageous. We conclude with a discussion of generating and learning features/representations.
0:49:25 - Evolution of CNNs
1:05:55 - Deep Learning & Feature Extraction
1:19:27 - Learning Representations
Thanks for making this course available to the general public; actions like these are a very valuable contribution to opening high-class education to more people.
You're welcome. I try my best.
I know you guys probably spent a lot of time on making this course and to share it by making it publicly available and I just want to tell you how much I appreciate that. Thank you so much!
I want to appreciate your efforts and the videos you make.
Yup, it's been taking forever, and I'll try to keep pushing videos out as new content is crafted.
You're very welcome 😁
Thank you so much for making this available in public, this a huge contribution to the community
I'm glad y'all have been finding them useful 😊
Great to see precious content open to the public; knowledge has no limits.
Agreed.
Out of the many available courses, this is the one I chose to embark on my journey in deep learning once and for all.
big thanks.
You're most welcome! 😇😇😇
Hi Alfredo I really enjoyed this lecture. This is inspiring, thanks for putting it as a free content. Thank you for inspiring us.
Glad you enjoyed it! 🤗
Thank you so much! Not everyone has an opportunity to study at world class college, this kind of online courses open a new door for many people, thank you!
That's great! I can't wait to learn what y'all have been using this knowledge for! 🤩
I've already watched this course almost 3 times, it keeps getting better!
👀👀👀
Thank you for sharing precious lectures! Amazing professor!
There are two of us, but… okay, you're welcome 😉
Hi Alfredo, I came to these lectures by chance as I was searching for a deeper understanding of Variational Auto-encoders. Your lectures on those are fantastic. I ended up spending more time browsing through many other lectures in the series by you and Yann.
Now my neurones are triggered enough beyond their thresholds that I have decided to take this course systematically and formally. Starting today, I intend to deep learn. :-)
Thanks a lot for these lectures and curation.
😇😇😇
We are the lucky generation to have Yann LeCun. Thank you so much for making this wonderful course.
You're welcome 😀😀😀
Thanks a lot for uploading this video on YouTube. The explanation of SVMs was brilliant.
Also, I am reliving my Courant days via these lectures.
Yay! 🥳🥳🥳
Thanks for making this content free for the general public, for whom direct interaction with the best in the field may be far-fetched, but who still have the zeal to pursue it. And thanks for answering the questions.
Yann and I answer every comment here 🙂
Thank you for your generosity. This is one of the best, if not the best, way to promote inclusiveness in the AI community. Much much better than those who only (emphasis on only) complain about the lack of representation in conferences, papers, etc. but never did anything concrete.
Hahahahahahahahahahahahahahahaha! I try to speak with facts 😜
I think your comment deserves a retweet, but it'd bring me too much 💩💩💩
Hahaha 🤣🤣🤣
Thank you so much for sharing and making the world greater!
🥰🥰🥰
so grateful for this! thanks NYU, prof LeCun, prof Canziani!
🥳🥳🥳
Thank you for this course! Really appreciate it.
You're very welcome 🤠
This is amazing! Thanks for making this public!
Hehe, you're welcome 😊
Hi Alfredo! Many, many thanks for doing such amazing work.
So nice of you 😛
Thank you so much for this awesome course, bless you
You're welcome 😊
From Italy: thank you Alfredo, this course is awesome!
Congratulations
Haha, thanks! Glad it's being digested by the Italian public too! 😁
Thank you so much for uploading those videos.
It's not uploading that's painful… editing took "some" time, haha!
Anyhow, you're welcome!
Great course. Thanks !
You bet!
Thanks Alfredo, you are the best
🤗
Thanks for your contributions!!! It helps me a lot!!!
Yay, I'm glad ✌🏻✌🏻✌🏻
Thanks for sharing, a great contribution to the public.
You're welcome 😄
thank you so much for making it public
Check out the more updated version of the course also available on my channel!
Thank you, Prof. Canziani, for sharing this video series. As a Master's student at NYU Tandon, I would like to register for this course, but could not even get onto the waitlist. With your great job uploading the videos, I can have access to the best course ever at NYU.
You are welcome.
This year there was no waiting list (I did remove that limit) and I did have Tandon students. So, I'm not entirely sure why you couldn't register. 😕
Great to hear that! I'll try again this fall semester.
Nice, simple illustration of deep learning concepts. Thank you very much for sharing it with the world.
Glad you liked it! 😊😊😊
Thanks a lot for sharing this with the world. This will be beneficial for a lot of students and analytics professionals.
😇😇😇
Hi Alfredo, I want to say an infinitely large thanks to you and your colleagues for this incredible course. It will be the primary source that I suggest whenever anyone asks me for a good course about Deep Learning.
Colleagues, like Yann? 😅😅😅
I'm glad you like it. Thanks for the recommendation! 🥰🥰🥰
Thank you very much for making the course available so that people from poor countries like mine can still get this top-tier education. Thank you very much.
Not a problem 😇😇😇
Thank you so much for making these videos public!
#foundingfather
You're welcome! 😊😊😊
Many thanks Alfredo!
You're welcome! 😃
Thank you!
De nada 😊😊😊
Thank you very much for uploading these videos😃
You're most welcome!
Thank you so much
You’re welcome 😇
I'm so happy to find this
🥳🥳🥳
Excellent Lecture. Thanks.
You’re welcome 😊😊😊
Thank you for making this available to everyone! This first lecture has been immensely interesting!
Check out the 2021 edition. The content is fresher!
@@alfcnz Oh great, awesome! Thank you very much! I came here from atcold.github.io/pytorch-Deep-Learning/; may I know if the material is still the same, or is there an updated site as well?
@@alfcnz I saw the course website in the new link. It's awesome, thanks again so much!
You're welcome 😇😇😇
Y'all deserve a spot in heaven!
Aww 😇
I think this perfectly counteracts whoever said I deserve a spot in hell for being me. Haha! Funny!
Class goals
🥰🥰🥰
thnx for the upload
Sure 😊
That "manifold hypothesis" part is so astounding :o
😅😅😅
This is so awesome! Thanks so much~~
不客氣 (you're welcome)~~
Thanks for making this openly available!!! 😍😍😍😍😍
My pleasure 😊
Thanks Alfredo for these videos. I would like to thank you personally for structuring the materials so well. This is great.
You're welcome! I finally did accomplish what I came to NYU for. Now I'm done and I can finally head to my next quest 😁
@@alfcnz What would that next quest be, if you don’t mind? 😁
@@labeeb_ibrahim I was thinking of learning Mandarin in Taiwan or Shanghai, so that I can teach even more people. And while I'm there, perhaps write a book?
@@alfcnz cool. That would be wonderful.
awesome video
Thank you ❤️
instablaster...
👀
Thanks for making this available to the public. Such amazing work. I will follow each and every lecture even though I already know up to week 7 of this course, because who would want to miss world-class education? The only thing is, I have learned the implementation part in TensorFlow. Is that going to be a problem? Should I study PyTorch before starting the course, or can I start now and pick things up where necessary?
Thank you again for making this publicly available.
Highly appreciated.
Most welcome!
Hehe, thanks! 😉
There will be 14 weeks in total. I'm editing and uploading when I get some spare time.
If you already know TensorFlow, following my explanations using PyTorch should be very straightforward. PyTorch does away with most of the overhead introduced by its counterpart. There is *no required knowledge* of any DL framework; Python proficiency is all that is needed.
You're welcome 😊
Wow, I have taken several courses on DL, but this is the clearest lecture.
🥰🥰🥰
🥰🥰🥰
Thanks so much
You're very welcome! ✌🏻✌🏻✌🏻
You have to be extremely patient and careful when listening, as he only speaks; I wish he would use a board for writing.
However, it's worth listening to him; his words are the starting points of my searches on the Internet.
All lectures and practica have been transcribed 📝 (in English) and translated to many languages on the course website.
If you prefer consuming textual information 🤓, you can turn on the subtitles and / or check out the transcripts! 🤗
Thank you very much for this course and all the materials! P.S. The creator of the materials has a very good sense of colour :)))
😇😇😇
Amazing lecture and great content. Thanks for sharing it. Also, is there a standard reference textbook for this course material which I can read alongside?
I haven't started writing it yet. You can find a transcript and translation of this course on the course website (link in the video description above).
Thank you so much for making the lectures available. Is there any way I can access the Jupyter notebooks used while teaching?
Of course. Everything is available for free.
Just read the video description 😃
Will you update the course this year? I am thinking of starting it now but I am not sure whether to wait for the updated recordings/notebooks or start right away.
Also, thank you very much for putting the whole course online! Here I really need to emphasise "whole". Many courses have only parts available to the public; this one has everything: recordings, a GitHub with Jupyter notebooks from the seminars/labs, and even a Google Drive folder with coursework, so you can actually do it yourself (if I understand correctly)!
Hi George, I am in the same boat. I chanced upon this course as I was searching for a deeper understanding of variational autoencoders. After browsing through a few lectures, I agree with you that it is comprehensive in theory, application, and resources.
I have decided to take it more systematically now, starting today and starting with the first lecture. I won't wait for the updated or Spring 2021 version (if there is going to be one), as this one is as up to date as it gets. If there is a revision, I will go over that again.
I'm posting updated practica. As my understanding and knowledge grow, my explanations are improving. I'm not entirely sure I'll push the lecture recordings. They are better this year, but not different enough to justify republishing.
Great video! Could you shed some light on, or provide a reference for, 00:37:04, where SGD is said to be better than GD because it converges faster for large training sets and generalises better on the test set? Why is that so?
SGD converges faster than GD on large training sets because you get more gradients for a given amount of computation. GD gives you one gradient per epoch, whereas (minibatch) SGD gives you nb_samples / batch_size gradients (pure SGD has batch_size = 1). So, by the time you reach the end of your first epoch, you have already made many, many updates to your parameters, which allows you to move faster in your optimisation process.
The noise in SGD gets your network to prefer a wider minimum. This is because all these gradients move the point in many directions, so you eventually escape narrow minima. Wider minima are inherently more stable: the loss does not change much around them. Moreover, these minima have been shown empirically to be equivalent, meaning the network will perform similarly well regardless of which one SGD hits.
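A toy numerical sketch of the update-count argument (my illustration, not course code): on a one-parameter least-squares problem with a hypothetical dataset, one epoch of full-batch GD yields a single parameter update, while minibatch SGD yields nb_samples / batch_size updates and therefore ends the first epoch much closer to the optimum.

```python
import random

random.seed(0)
data = [random.gauss(5.0, 1.0) for _ in range(1000)]  # optimum is near w = 5
lr = 0.1

def grad(w, batch):
    # d/dw of mean((w - x)^2) over the batch = 2 * mean(w - x)
    return 2.0 * sum(w - x for x in batch) / len(batch)

# One epoch of full-batch GD: a single update using all 1000 samples.
w_gd = 0.0
w_gd -= lr * grad(w_gd, data)
updates_gd = 1

# One epoch of minibatch SGD with batch_size = 50: 20 updates.
w_sgd, batch_size, updates_sgd = 0.0, 50, 0
for i in range(0, len(data), batch_size):
    w_sgd -= lr * grad(w_sgd, data[i:i + batch_size])
    updates_sgd += 1

print(updates_gd, updates_sgd)  # 1 vs 20 updates after one pass over the data
print(w_gd, w_sgd)              # SGD ends the epoch much closer to w = 5
```

Same data, same learning rate, same cost of one pass; the only difference is how often the parameter gets moved.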
Great to learn from the legend; hilarious at times. It would be great if you could make the final project competition open to the public.
You mean the results?
No, not just the results; please allow people to submit their models and compete on a common leaderboard with the NYU students. I know this may be a bit too much to ask, but just in case.
Ehm… 😅 We need to run each team's network to assess its performance on the test set. If we manage to automate the process, then I guess we could.
That is a great idea! Perhaps host a challenge on Kaggle: one for in-class students and a similar one for the general public.
Alright, I think we'll try to host it on Kaggle and open it to the public too!
Good very good
Thanks, really, thanks! 😃😃😃
Holy moly, that awesome goosebump moment when you are being taught by a Turing Award winner and the godfather of deep learning himself!
held* 😜😜😜
@@alfcnz Right. "Taken" is for the students. 😂 Anyway, both of you are awesome teachers. Thank you so much!
😃😃😃
🙌💯
😎
I didn't get the last part. Why is the manifold not a plane? What does it mean to rotate 360°? Thank you!
You need to include hour:minute:second if you want me to be able to understand what it is you're talking about.
Is the prerequisite course also available online? If yes, please provide the link.
Thank you
Of course it is. Read more about it here: twitter.com/alfcnz/status/1314050772968132610
Hi Alfredo,
In order to jump into this course, do we need to have an understanding of PyTorch?
No, I teach you everything from scratch.
1:00:20 Have different colour spaces been tried?
Colour space transformations are linear, as far as I know. Nothing that the first convolutional kernel cannot learn by itself, through gradient descent.
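To make the linearity concrete, here is a small sketch (mine, not from the lecture): the standard BT.601 RGB-to-YCbCr conversion is just a fixed 3×3 matrix applied to every pixel independently, i.e. exactly the kind of map a 1×1 convolution over the input channels could learn by gradient descent, so hand-picking a colour space buys the network little.

```python
# BT.601 RGB -> YCbCr conversion matrix (luma + two chroma channels).
RGB_TO_YCBCR = [
    [ 0.299,   0.587,   0.114 ],   # Y  (luma)
    [-0.1687, -0.3313,  0.5   ],   # Cb (blue-difference chroma)
    [ 0.5,    -0.4187, -0.0813],   # Cr (red-difference chroma)
]

def convert(pixel):
    """Apply the fixed 3x3 matrix to one (r, g, b) pixel in [0, 1]."""
    return [sum(m * c for m, c in zip(row, pixel)) for row in RGB_TO_YCBCR]

y, cb, cr = convert([1.0, 1.0, 1.0])   # pure white: full luma, zero chroma
print(y, cb, cr)
```

Since a 1×1 convolution with 3 input and 3 output channels parameterises exactly such a 3×3 per-pixel matrix, any colour-space change is within reach of the first layer.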
Great lectures. Is Piazza accessible to the public? I used the code but it doesn't seem to be working.
Piazza is only for the in-class students.
You can get me to reply to any question here (if related to a specific topic / video) and on Twitter, though.
We have a subreddit now. Check it out! www.reddit.com/r/NYU_DeepLearning/
Hey Alfredo. Hope you're well.
I wanted to ask: does this course have an accompanying text, or can you recommend one? (Like a proper book that deals with deep learning using PyTorch.)
There is a book, 'Deep Learning with PyTorch'; you can refer to that if you want. It's a good book for reference.
The course website contains the transcriptions and translations of every video.
I haven't yet started writing my book. Maybe next year?
The book is one third done. The first draft should see the light very soon! 😀😀😀
GodFather Teaching :)
Haha 😆
Is there anything on natural language processing too?
We have an entire course on that. Nevertheless, I'll teach you how to train a language model with a recurrent net and how transformers are used to perform neural machine translation.
What prerequisites do you think there are to fully understand the concepts in this course?
High school math and physics are sufficient. In my sections I break everything down to that level.
If you still feel puzzled, we reply to each and every question. 🤓
@@alfcnz That's cool. I have not checked this, but does the course have an implementation part as well, or just explanations of concepts? I see there are practical sessions; is that what they are for? Thanks!!
Hi Alfredo,
Can I ask you one question?
I am working as a Business Analyst in an IT company, and I have just started learning Python in VS.
Is this the right time for me to start learning from this course?
Also, thanks for uploading the course online 😊
You need to go through an introductory machine learning course, first. We recommend David's twitter.com/drosen/status/1019411522576113665
@@alfcnz Thank you so much Alfredo for the guidance.🤗 I will come back to your videos after learning some basics from David's lectures..😊
Yay! 🥳🥳🥳
It is an honor to take a lesson from the living legend. I have a question.
Can I make Korean subtitles for this course? It would be so great if I could share this course with Korean people who don't speak English.
If you allow me, I will make them and provide them to you via the YouTube subtitle contribution feature. No re-upload, no reward.
Absolutely! 😍🤩❤️
I also have a website, which currently I'm translating into Mandarin. Please, contact me via email (my_surname@nyu.edu).
대단히 감사합니다! (Thank you very much!)
@귀욤뽀작 👼🏼
@@alfcnz when you finish, Chinese students will appreciate it!
You can find the Mandarin translation of the lecture notes here: atcold.github.io/pytorch-Deep-Learning/zh/
Also, all videos are available on Bilibili as well: space.bilibili.com/521490937/video
Not sure how to create a playlist yet… 😅
@@alfcnz this is great, because mainland Chinese students do not have access to YouTube. Thank you.
Maths prerequisites for this course?
High school math, basic machine learning.
The videos are very good, but I am not able to code in PyTorch. Can someone suggest a free course on PyTorch?
Facebook's resources on PyTorch are the best place to start, maybe the only place. I would also suggest going for fast.ai after you've learnt PyTorch.
Fast.ai does not teach PyTorch. It uses a library on top of it, which is not what we explain in this course.
I recommend checking out the practica here, on YouTube, and the notebooks on GitHub.
@@alfcnz thanks Alfredo
@@rishabhsharma6069 you're welcome.
Subtitles in Spanish, pleaseeee!!
Haha, I have to check how the translation is going! 😅
There are volunteers translating the entire course into many languages as we speak.
Hello. Please provide these videos with Persian subtitles.
Why is the room so dead lmao those were good jokes
Hahaha 😅 they are all very scared of him! 🤣 He _is_ intimidating at first! 😅
1:19:21 'They're "extremely" stupid' 😂
And it starts at 1:17:38 😬
He has a french accent
I came here to see if the guy who "invented" ELM disliked the video, and it seems he didn't!
Elm, the functional language that compiles to JavaScript?
@@alfcnz Haha, one way to deal with that kind of "extreme" is to sweep it aside, I'll give you that!
@@paradoksumsu Oh, oh, Extreme Learning Machines! (I had to search on Wikipedia to find out what ELM could stand for 😅) Maybe he did come along but learnt something new instead? 🤔
Aaaaaaaaaahhhhh.
Bbbbbbeeeeee.
Not the best presentation! Slides are packed and concepts suddenly pop up. A very dense presentation.
🥺🥺🥺
@@alfcnz I must confess that I was too quick to comment. After going through the second video onward, I must say this is a great presentation: one of the few courses that actually builds up the theoretical background along with a detailed implementation in PyTorch. So my opinion has changed, and now I am the biggest fan of the course and its presentation.