Sparsity and Compression: An Overview
- Added 23 Aug 2020
- We introduce the mathematical idea behind image compression: Sparsity!
@eigensteve on Twitter
These lectures follow Chapter 3 from:
"Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Book Website: databookuw.com
Book PDF: databookuw.com/databook.pdf
Brunton Website: eigensteve.com
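As a concrete illustration of the idea this video introduces (a sketch, not code from the book): compress an image by transforming into a basis where it is sparse, here the 2D FFT, and keeping only the largest coefficients. The image array below is a random placeholder standing in for a real grayscale image.

```python
import numpy as np

# Placeholder "image": in practice, load a real grayscale image here.
rng = np.random.default_rng(0)
img = rng.random((64, 64))

F = np.fft.fft2(img)                 # Fourier coefficients of the image
keep = 0.05                          # keep only the largest 5% of coefficients
thresh = np.quantile(np.abs(F), 1 - keep)
F_sparse = np.where(np.abs(F) >= thresh, F, 0)

img_approx = np.real(np.fft.ifft2(F_sparse))   # reconstruct from sparse coeffs
compression_ratio = F.size / np.count_nonzero(F_sparse)
```

For a natural image (unlike this random placeholder) the Fourier coefficients decay rapidly, so the 5% reconstruction is visually close to the original; that decay is exactly the sparsity the lecture is about.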
Thanks Professor Brunton - amazing as ALWAYS!
Again I'm asking questions I didn't even know I had. Phenomenal stuff.
This channel is a goldmine. Thank you for putting in your time and effort into making all this superb content.
Hi Professor Steve Brunton, thank you so much for introducing the new chapter. I am really pleased and excited.
Professor Brunton, I think nobody can explain such complicated issues as you do. Thanks a lot!
One of my favorite topics!!! Thank you.
You are such an impressive lecturer! I watch your videos and lectures to get some knowledge and have fun. People listen to music while cooking, whereas I listen to you while cooking and studying. Very big thanks!
Thank you so much for your wonderful lecture and book Professor Brunton!
Impressive energy with your education delivery. You have the ability to make any topic a favourite content for someone.
Thank you , all your Lectures are useful
Very much looking forward to this lecture series. I've purchased the book and I'm going through it now. I'm a PhD student focusing on signal processing and data science. Your book has helped so much with clarifying concepts I'm using in my research. Thank you again.
Steve, you are truly the best; your enthusiasm is contagious. I am learning so much from your book, videos, and code. Thank you for this.
Thank you so much for these videos, mate. I am so thankful to you and people like you who bring these deep concepts to life through engaging videos. Love your work!
Very excited for this new lecture series! 😊
Thank you for posting these lectures - the SINDy stuff was really cool, so looking forward to these as well!
I had never heard of this...it is truly fascinating and intriguing! Great job explaining in the videos and book...thanks very much for your hard work.
Looking forward to this lecture series, having read the book. Very interesting stuff.
Looking forward to watching this series on this topic. Compressive sensing seems like magic.
Great mirrored video as usual Dr. Brunton!
Thank you very much for the amazing video, the book, everything.
I need this topic for my project, right lecture series at the right time. Thank you
Every day I check your channel for new videos.
I'm absolutely grateful that your content is free.
Knowledge you will not find anywhere else... kudos to Steve Brunton.
Thank you so much for these wonderful explanations and topics. Your lectures are amazingly informative, yet so simple for anyone to understand; I learned so much through your lectures and am eagerly waiting for the next one.
Amazing introduction... Thanks!!
thank you Professor. I am looking forward to following these lessons
Thanks a lot, professor... really helpful.
Looking forward to the series
As always...... AMAZING!!
I'm hyped! Will be awesome to learn something I haven't already, or at least remember :)
God bless you, sir! This is something so fundamental. These videos are like medicine to the inquisitive mind.
Thanks from Japan, your talk is always so exciting! And it was a PASMO card!
I love my PASMO card!
Thank you Prof Brunton for teaching me I'm not too old for super interesting and super applicable math!
Your videos are awesome.
Looking forward to following the series. Coming from a Fluid Mechanics background, it would be really interesting to get some ideas on how this technique of sparsity i.e. selecting modes that are really relevant can be incorporated in predicting turbulent flows. This is of interest experimentally as well due to the limitation of certain techniques.
Thanks! And I absolutely agree, there are a ton of cool applications in fluid mechanics. I'll post some videos about this in the coming weeks/months.
Looking forward to the lecture series - leaving a comment to please the algorithm.
Great content as usual! Btw, nice post-processing on the video
Thank you for your lectures. They've made me love linear algebra when I used to dislike it.
thank you so much for your lecture
I wish I had you as a professor or a PI professor Brunton.
Keep up the great work!
Well I'm hyped now.
Can you tell us which theme you are using for Jupyter?
amazing !
Thank you!
Hi Prof. Brunton, great material. Some of your lecture videos are now private, any specific reason for this? Thank you
Sorry about that -- I'm releasing them on a schedule, and having them private first is the only way my subscribers get an announcement when they are released.
I was about to click off because it isn't part of the Fourier series, but I got interested quickly and stayed :D
Oh yes, very interesting. Out of curiosity: after a first iteration, the info left out could be classified as random noise, or info below the perception level (of the visual cortex). Would repeating the compressed-sensing procedure in a second iteration, on the sparsely reconstructed image, yield the same sparsity/coefficients?
Awesome
Amazing
Thanks 😊👍☺️😊😊
This is gonna be lit.
Sir, thanks for the lecture. I have a few things to ask:
1. From your DMD book, the flow past a cylinder (Re=100) has 151 snapshots of vorticity contour data in a mat file ("VORTALL"), to which we then apply DMD/POD.
2. I have raw PIV image datasets (jets, flow past a cylinder, up to 1000 image pairs for each case).
At this stage I need help: how do I stack the vorticity contour data into the matrices X and X' to study DMD?
It would be a great help if you could explain.
Thanks
Want to work as a post doctoral fellow under your guidance if possible...
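Since the question above is about how the DMD snapshot matrices are built, here is a minimal numpy sketch of the standard exact-DMD recipe. The dimensions and the random array are placeholders (the real VORTALL in the cylinder example stores one flattened vorticity field per column); the structure of X and X' is the point.

```python
import numpy as np

# Placeholder for the vorticity snapshots: each column is one flattened
# vorticity field, so VORTALL has shape (ny*nx, number_of_snapshots).
ny, nx, m = 50, 100, 151          # assumed small dims for illustration
VORTALL = np.random.randn(ny * nx, m)

# Time-shifted snapshot matrices for DMD:
X  = VORTALL[:, :-1]              # snapshots 1 .. m-1
Xp = VORTALL[:, 1:]               # snapshots 2 .. m  (the "X'" matrix)

# Exact DMD with rank-r truncation:
r = 21
U, S, Vh = np.linalg.svd(X, full_matrices=False)
Ur, Sr, Vr = U[:, :r], S[:r], Vh[:r, :].conj().T
Atilde = Ur.conj().T @ Xp @ Vr / Sr      # reduced linear operator
eigvals, W = np.linalg.eig(Atilde)       # DMD eigenvalues
Phi = (Xp @ Vr / Sr) @ W                 # DMD modes, one per column
```

For PIV image pairs, the analogous step would be: compute a vorticity field per pair, flatten each field with `.reshape(-1)`, and place the flattened fields as consecutive columns of the data matrix before forming X and X'.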
Does this mean you're done with LaPlace transform?
Thank god you exist
How long did it take to you to learn to write mirrored so well, Steve?! 🤔👍
Legend...
I like... I comment... I subscribe.
0:43 PASMO card!
Sir, what are the l1 & l2 norms? Why do we need them?
The L1 norm is the sum of the absolute values of the elements, and the L2 norm is the Euclidean norm.
@@hardrocklobsterroll395 thanks :)
More generally, the Ln norm is the n-th root of the sum of each coordinate's absolute value raised to the n-th power. It extends to n = infinity, and with some limit magic you can prove that the L-infinity norm of a vector is its largest entry in absolute value.
@@alegian7934 thank you :)
@@hardrocklobsterroll395 Thanks!
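Putting the thread's definitions into code (illustrative vector only): the l1 norm sums absolute values, l2 is Euclidean length, and l-infinity is the largest absolute entry. The l1 norm is the one that matters for this lecture series, since it is the convex proxy for sparsity used in compressed sensing.

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

l1 = np.sum(np.abs(x))          # sum of absolute values -> 7.0
l2 = np.sqrt(np.sum(x**2))      # Euclidean length       -> 5.0
linf = np.max(np.abs(x))        # limit of Ln as n -> inf -> 4.0

# These agree with numpy's built-in norm:
assert l1 == np.linalg.norm(x, 1)
assert l2 == np.linalg.norm(x, 2)
assert linf == np.linalg.norm(x, np.inf)
```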