Singular Value Decomposition (SVD): Matrix Approximation
- Added 18. 01. 2020
- This video describes how the singular value decomposition (SVD) can be used for matrix approximation.
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Book Website: databookuw.com
Book PDF: databookuw.com/databook.pdf
Brunton Website: eigensteve.com - Science & Technology
This channel is so underrated; your explanations and overall video presentation are really good!
Don't know why you think it's underrated...
Everyone who is watching these videos knows how great they are.
This series is by far the best explanation of SVD that I have seen.
SVD was at the very end of my college LinAlg class so I never got a very good understanding of it before the final - this is truly amazing; you say "thank you" at the end of every video but it should be us saying it to you- keep doing your thing! I'm loving it.
The best thing about your lectures is that you do coding implementations along with the heavy math. That makes you different from the rest of the traditional instructors. Kudos to you!!!
It's my pleasure
The best explanation of SVD. Your videos are excellent. Thank you very much!
Please keep making these high quality lectures. They are some of the best I have seen on YouTube, and that goes a long way because I watch a lot of lectures online.
You have a talent for taking complicated topics and breaking them down into digestible pieces. That's the sign of a good teacher. Thank you.
You explain math in such a way as to not make someone feel stupid, but to make them feel like they're taking steps toward understanding a larger concept, and the tools they need are ones we already have. Big ups.
This was, by far, the most comprehensible explanation of what the SVD is, mathematically and visually. The SVD is an incredible algorithm! Amazing how little you can keep and still understand the original system.
I find everything about these courses, even the way the board is arranged, just great. Many, many thanks for this wonderful explanation and all your effort to make it understandable and yet complete.
Superlative production! Lighting, sound, set, rehearsals, material: these videos are among the best productions on YouTube. Even I understood some of it! :-)
What a wonderful way to simplify a complicated topic such as SVD--I wish more people in academia emulated your way of teaching, Mr. Brunton.
I learned this in college but couldn't recall a bit after working in the industry over a year. This explanation not only helped me refresh my memory but also enhanced my understanding as well.
Not an engineer/student, but I'm watching this to get a better understanding of PCA in statistics. I'm going to check the book and research this, but my only complaint (nit-picky) is trying to tell the difference between "M" and "N" when Steve speaks, which I know refer to the number of rows or columns of the matrix. But really, this was great and I am thankful that this is something I can study on my own. Much appreciated.
One of the best videos on singular value decomposition. It conveys not only the math but also the intuition. Thanks!
Every time I think it's time to pause and comment on this video with "awesome", it surprises me with a more informative perspective. Great job.
This is also the best explanation of SVD I have seen! Thanks for sharing!
One of the best channels I have ever followed, appreciate it so much!
Thank you for presenting us an amazing experience to learn about SVD!
Can't overstate how good this series is...
It doesn't get better than this. I am so thankful to you. I don't know how to repay this help.... And yes, this is a highly underrated channel
The best explanation of SVD I have ever seen!
This video series is so helpful!! Thank you Dr. Brunton!
Thank you very much for these videos; they are very explanatory. Keep up the good work, we need your lessons for our academic improvement.
Gentleman, you have done a very impressive job. I have just started exploring data science and have recently completed my college course in Linear Algebra. This was quite interesting.
A super-informative series on the SVD.
This is by far the best video explaining SVD on YouTube.
Such a cool concept of decomposition and brilliantly explained here. Big thumbs up!!
Can you do a series on QR decomposition as well? This is so useful!
Just started watching this playlist, excellent explanations and a great way to promote while sharing knowledge; bought your book and can't wait to revisit w/the text!
Awesome, thank you!
These are the best videos out there. Biggest fan Steve, keep it up.
Wonderful explanation, clear and easy to understand. Thank you very much
This one's a brilliant explanation. Simply loved it.
This is answering a lot of my questions on SVD.
Very, very nice explanation and presentation. Thank you!
the best lecture of SVD I have ever seen!
Literally the best SVD explanation, so meaningful.
Thank you so much for posting the course material. I was running through asking ChatGPT some questions about eigenvector/eigenvalues and revisiting some linear algebra when I stumbled upon transitional probability matrices or Markov Matrices, PCA and SVD as was getting back to my Data Science studies. This is very exciting stuff and your presentation is very clear and understandable.
Explanations like this for a dummy like me makes my life so much easier
You are a very very gifted teacher! Thank you for sharing this! :)
You are so talented at teaching, great explanations!
Matrix approximation? More like "Magnificent explanation!" I really can't convey in words how absolutely outstanding *all* of your videos are.
It was a pleasure to watch. You should do more educational videos, Mr. Brunton.
Amazing lectures, immediately bought the book, thank you!
The book is great, but relatively terse for someone like me who needs to brush up on his linear algebra. These video lectures are an excellent complement to the book and really help drive home the concepts.
Gosh, what a class! As Mr. Ayush said, this was indeed by far the best SVD explanation I've seen. You've made such a complicated subject much more approachable! I wish you all the best, Steve! Greetings from Brazil!
Thanks so much! That is great to hear!!
Thank you for making the linear algebra less boring and really connected to data science and machine learning, this series is so much more interpretable than what my professor explains
Hey I know it's been nine months but I just came across your comment and was curious. How'd the rest of your class go?
Just in time for the new semester!
Wow, what a wonderful presentation. Congratulations.
This is a gold mine! thanks for your videos
love the video, well explained and aesthetically good.
Beautifully explained.
Good explanation! Many thanks! How does one manage to explain this material in such an elegant way?
Really, really nice explanation! You are truly a great teacher!
Great video. Should be required viewing for anybody learning the SVD!
I've read about and even used SVD. But I never really understood it in this way. Thank you!
The explanation is really wow !! Very intuitive ... thank you so much !!
Love his explanation style
really appreciate your efforts. wish u all the best
This is the best explanation ever
Steve is a magician of explanation.
Thanks so much!
It covers a lot. Thanks for the beautiful teaching!
Thank you for this video series.
Thank you so much for this easy to understand explanation. I was really struggling with the topic and this helped a lot. Thanks again 😊
Glad it was helpful!
13:41 The way I like thinking about it is, for example, that two vectors in R^3 that are orthogonal are not necessarily orthogonal when projected into R^2, which is essentially what is being done by dropping some of the dimensions. Love the videos though, has me thinking about SVD again.
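The point in the comment above is easy to check numerically. A minimal sketch with my own made-up vectors (not from the video): two vectors orthogonal in R^3 stop being orthogonal when a coordinate is dropped, which mimics truncating dimensions.

```python
import numpy as np

# Two vectors that are orthogonal in R^3 (illustrative values)
a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0, 1.0, -2.0])
assert np.isclose(a @ b, 0.0)      # orthogonal in R^3

# "Project into R^2" by dropping the last coordinate,
# analogous to dropping dimensions in a truncated representation
a2, b2 = a[:2], b[:2]
print(a2 @ b2)                     # -> 2.0, no longer orthogonal
```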
Thank you Prof. Brunton, excellent explanation! Just come from MIT 18.06.
I like your series; also, the dark background is easier on my eyes than the white backgrounds other channels use.
Thank you so much, sir, very helpful.
Excellent, excellent content. Thank you so much!
To the point. Answers all the important questions. I mean you should come to the party knowing some lin alg but great for intermediate level
Excellent explanation. Thank you very much.
The best teacher that I have ever met.
Great lecture Steve, really enjoyed it! I have a couple of questions: is it better to have longer or shorter time series for the SVD? And does the tall, skinny matrix you mentioned when speaking about the "economy" SVD only matter for the memory and time savings, or is it generally recommended to have such an input matrix? Thank you in advance, and best greetings from Austria) Alexander
Just amazing explanation.
You are at the tip-top; I like your explanation.
This is a fantastic video!!
Thank you very much. Excellent explanations.
This is everything that I need. Thanks for nice explanation .
You are welcome!
Incredible explanation!
Top rate education, I'm happily learning a lot.
Nicely done. Thank you
Lightbulbs are finally going off when it comes to SVD; can't thank you enough!
Watch out Khan Academy, Steve Brunton is coming for ya! Seriously though, these videos are fantastic :)
Sir. Just thank you for making me be your student in here for free! Great performance, great job!
Extremely clear. Just want to point out a fact which at first did not seem obvious to me: the outer product has rank 1 because all the columns are proportional to the first vector of the outer product, hence they are linearly dependent.
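The rank-1 fact in the comment above can be verified directly; a small sketch with random vectors of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(3)

A = np.outer(u, v)   # 5x3 outer product u v^T
# Column j of A is v[j] * u, so every column is a multiple of u
assert np.linalg.matrix_rank(A) == 1
```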
I never *really* appreciated SVD until I saw this video. Mind blown!
Mindblowing!
Great video from a linear-space point of view. One naive question: can we make use of the Lanczos algorithm to pick out only the most significant eigenvalues for the approximation, in order to circumvent the full diagonalization of the whole large matrix?
Thank you so much, sir, for this explanation; it's truly amazing.
Thanks, amazing as always 👍
Thank you so much professor.
I'm glad he made it clear that *outer products* were taking place.
Wow, you are a really good teacher.
Great lectures. Thanks so much. Kind of hard to distinguish the m's and n's sometimes. Hoped you'd use some other letters :D
incredible!
for mn? (For example, what happens if my dataset is composed of 5000 images of 32x32?)
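For the 5000-images-of-32x32 case above, a common convention (assumed here, since the comment doesn't specify) is to flatten each image into a length-1024 column of the data matrix, giving X of shape 1024 x 5000. The economy SVD then keeps only min(m, n) = 1024 singular values and avoids building a 5000 x 5000 V. A sketch with random data standing in for real images:

```python
import numpy as np

# Hypothetical dataset: 5000 images of 32x32, each flattened
# into a length-1024 column of X (an assumed convention)
rng = np.random.default_rng(0)
X = rng.standard_normal((32 * 32, 5000))   # shape (1024, 5000)

# Economy SVD: full_matrices=False keeps only the useful parts
U, S, Vt = np.linalg.svd(X, full_matrices=False)
print(U.shape, S.shape, Vt.shape)   # (1024, 1024) (1024,) (1024, 5000)
# A full SVD would instead return a 5000x5000 V^T, mostly wasted storage
```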
Even better than the previous one.
Prof. Steve, thanks for your explanations. But what is the difference between the POD and the SVD for aerodynamic analyses?
I cant believe I am watching this for free. Thank you!
It took me some time to accept the conclusion at 1:22, since, if I understood you right, we have an n-dimensional vector space our data can live in, so it should be okay to use all n vectors of U as a new basis, unless we want dimensionality reduction and not just a matrix decomposition. Or am I just missing something?
Thank you for this great explanation. I just lost you on one point: why does this matrix multiplication equal sigma_1 u_1 v_1^T + sigma_2 u_2 v_2^T + ... + sigma_m u_m v_m^T?
Can someone explain how this adds up to the entire matrix multiplication? I somehow got lost with the columns of U and rows of V.
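The two questions above can be settled by direct computation: multiplying U (whose columns are u_i) by the diagonal of singular values and then by V^T (whose rows are v_i^T) produces exactly the weighted sum of rank-1 outer products. A minimal numpy check on a toy matrix of my own:

```python
import numpy as np

# Small random matrix to verify X = U S V^T = sum_i s_i * u_i v_i^T
rng = np.random.default_rng(1)
X = rng.standard_normal((4, 3))
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Rebuild X one rank-1 piece at a time: column u_i of U times
# row v_i^T of V^T, weighted by the singular value sigma_i
X_sum = sum(S[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(S)))
assert np.allclose(X, X_sum)
```

Truncating this sum after the first r terms is what gives the rank-r approximation the video discusses.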
Thank You Professor. Respects from India
You're welcome :)