
33. Left and Right Inverses; Pseudoinverse

  • added 14 Aug 2024
  • MIT 18.06 Linear Algebra, Spring 2005
    Instructor: Gilbert Strang
    View the complete course: ocw.mit.edu/18-...
    YouTube Playlist: • MIT 18.06 Linear Algeb...
    33. Left and Right Inverses; Pseudoinverse
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

Comments • 122

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +170

    I've been taking proper notes, doing the readings, and all the things you would expect of a student (short of taking the quizzes and final). I'm surprised at myself for having the will, interest, and sustained performance to get here. I'm tooting my own horn because I think I deserve it. Thanks to Professor Strang, I haven't just learned one of the most beautiful frameworks in mathematics. I have also proven that I can focus and get things done. If you knew me, you'd know that I'm unreliable, highly motivated at first, but that I ultimately lose steam and perform poorly. This class taught me that when I choose to do something to completion, and mean it, good things can happen. Thank you from the bottom of my heart, Professor G. Strang! Seriously, I'll always hold you in high regard. You gave me the MIT experience. You made me really wanna roll up my sleeves and learn deeply, learn how to think, which is impressive because I'm a college dropout who wanted nothing to do with education (well, schooling, you come across that line of reasoning, I'm sure). This course has improved my confidence. I just wanted to let you know, the special touch you placed on every lecture was precisely what I needed to start to see education in a whole new light. I've said it a million times now, but thank you Professor Strang! And of course, many, many thank you's to the MIT OpenCourseWare for making such high-quality, timeless content available to millions like myself for free. You're changing minds.

    • @owen7666
      @owen7666 4 years ago +7

      There is a wonderful momentum to this course that few courses can match. And Prof Strang puts such great effort into building good intuition for the subject that when much more advanced topics appear, they just fall into place. Really, the breadth and depth of this "introductory" course are amazing.

    • @Hotheaddragon
      @Hotheaddragon 3 years ago +3

      You are not alone in that feeling.

    • @lzawbrito
      @lzawbrito 3 years ago +4

      May I ask if you're still using linear algebra after completing the course, and if so, whether you think learning it from Prof. Strang served you well long-term? And congrats. It takes serious dedication to do something like this without the threat of bad grades and all that.

    • @ozzyfromspace
      @ozzyfromspace 3 years ago +5

      @@lzawbrito thanks for checking in. Yes, I’m still using linear algebra in my own private studies. I’ve since gone on to self-study Sturm-Liouville theory, the theory of integral transforms, and complex analysis, which are really just infinite-dimensional vector spaces. Just a few minutes ago, I was solving the Poisson equation in a compact space in R2 (I just took a break, came online, and found your comment 😅).
      Physicists always use the so-called fundamental solution (which is really just a radial basis function) to solve Poisson’s equation, motivated by the fact that the Laplacian has rotational symmetry in the usual sense. However, I’m doing research to show that there is a more general solution involving non-radial basis functions (which makes sense as something you’d look for if you’ve taken a linear algebra course as fine as Professor Strang’s). I find this line of inquiry interesting because Maxwell’s equations in classical physics are really just a form of higher-order Poisson equations, and their underlying potential fields are expressed as radially symmetric convolutions. But the two issues are that 1) potentials only work over all space, not in compact geometries (Helmholtz decomposition theorem) and 2) you can’t capture all the expected dynamics if you’re only composing your functions with radial bases. So, basically, the retarded potential solutions of E&M are just a projection of a function onto a radial subspace. Linear algebra is a good way to see this :)
      So I want to figure out the most general structure of solutions to Maxwell’s equations, and then write a machine learning algorithm that can minimize parameters on the equation to predict the mathematically ideal electronic circuit topology given a set of constraints. As you can tell, I’m a little “all over the place”, but I still insist that this was the first class I took that gave me the confidence to do math and stick to it.
      As for your final point, I’ve found that the threat of bad grades generally hurts my understanding, because then I’m learning to pass rather than to deeply understand. My motivation is that I’m working towards a novel technology and it needs a lot of mathematics. I care deeply about said technology (the float circuit - hence the pseudonym 😅), so I’m internally motivated. I dream lucidly about math. I can’t turn it off. I have a deep appreciation for why I’m trying so hard. Best wishes with everything, Lucas!
      Edit -
      one day I’ll start making YT videos documenting my way of understanding things, to help people that want to “go deeper” on concepts. I just don’t feel ready (knowledgeable enough) today, but that’s the plan. Hopefully by this coming January. I’m just doing a measure theory course to tie up some loose ends in understanding for my first course on PDEs.

    • @lzawbrito
      @lzawbrito 3 years ago +2

      @@ozzyfromspace This is awesome. The applications of linear algebra in machine learning are many, I imagine. I self-studied using Strang's course exactly for physics (QM) and computer science, since those subjects are what I plan to major in. And your research is this cross-section of so many different fields... I subscribed just to see what you come out with.
      It's also funny you mention PDEs, as I'm taking a PDE course next semester!

  • @gamingChinaNo1
    @gamingChinaNo1 4 years ago +136

    I can't believe this class came to an end so soon. Watched from lecture 1 all the way to here.

  • @MrLakastro
    @MrLakastro 3 years ago +42

    "If a matrix takes a vector to zero, there is no way its inverse can bring it back to life!" - Gil Strang

  • @YtEssEss
    @YtEssEss 4 years ago +62

    Absolutely wonderful series of lectures. Thanks a zillion to Prof Strang and OCW for making this available. These are absolutely priceless!

    • @YtEssEss
      @YtEssEss 2 years ago

      @aDBo'Ch 1 Thanks for pointing out the typo.

  • @naterojas9272
    @naterojas9272 4 years ago +30

    Strang: "You know what that matrix is"
    Me: "I do?.... Wait a minute... I do!"
    Classic moments in Linear Algebra with Prof. Strang 👌👌👌

  • @jinzhonggu8276
    @jinzhonggu8276 2 years ago +4

    The charm of this course is that after all these lectures, you feel so empowered and confident that you're willing to give any linear algebra problem a try.

  • @neoblackcyptron
    @neoblackcyptron 3 years ago +7

    I have huge respect for this lecture series and Professor Strang. This exact syllabus is what we have to learn in 2020 as one of our core courses in a CSE MS in AI/Robotics and ML. It is going to take multiple revisions to really internalize all that this lecture series has to offer. Professor, I hope to build some good things and help usher in the AI revolution that is happening. Thank you for sharing your knowledge with us, the next generation, to push things a little bit more into the future.

  • @georgesadler7830
    @georgesadler7830 3 years ago +4

    This is another fine lecture on inverses. This is the first time that I have seen left, right and pseudoinverses in a linear algebra class. I really learned these topics from Dr. Strang. These topics are important in linear algebra, signals and systems theory, and control engineering.

  • @mazyarmazari3346
    @mazyarmazari3346 3 years ago +2

    Prof Strang is by far the best teacher I've ever known in my life; he takes things that are very complicated and teaches them like they're super easy. Thanks to MIT OpenCourseWare for making this lecture available; your work is priceless.

  • @MaozhuPeng
    @MaozhuPeng 4 months ago +1

    Just finished the whole course. Thank you Professor Gilbert Strang

  • @nota2938
    @nota2938 a year ago +2

    The bijection between the row space and the column space was hinted at quite nicely in previous lectures and in the textbook by Dr. Strang, e.g. he stressed the fact that an identity emerges when we do row reduction. So nice that I figured this out when I walked through the previous materials. But still, he managed to show there's a new (to me) point of view on this bijection: the pseudo-inverse works as *the* inverse function of the bijection, and the pseudo-inverse *also eliminates* a null space (the left null space, resp.), just like the original matrix does. Such symmetry, such beauty.
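
    A small NumPy sketch of this symmetry (the rank-1 matrix below is arbitrary, chosen only for illustration):

        import numpy as np

        # Illustrative rank-1 matrix: row space and column space are each one-dimensional
        A = np.array([[1.0, 2.0],
                      [2.0, 4.0]])
        A_plus = np.linalg.pinv(A)

        # For x in the row space, A+ inverts the bijection: A+ (A x) = x
        x = np.array([1.0, 2.0])                  # a multiple of the row (1, 2)
        print(np.allclose(A_plus @ (A @ x), x))   # True

        # And A+ kills the left null space, just as A kills the null space
        y = np.array([2.0, -1.0])                 # A^T y = 0, so y is in N(A^T)
        print(np.allclose(A_plus @ y, 0))         # True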

  • @quirkyquester
    @quirkyquester 4 years ago +3

    this is the beauty of life, thank you so much Professor Strang and MIT!

  • @onnalinkajornklam5936
    @onnalinkajornklam5936 6 years ago +8

    Thank you Professor Gilbert! What you did is incredible; this helps take me where I want to be in life!

  • @attilakun7850
    @attilakun7850 2 months ago

    For 8:22, the null-space of A^t A being the same as A's null-space (when A has linearly independent columns) can be proven as follows. Suppose x is in the null-space of A^t A:
    A^t A x = 0. Multiplying both sides on the left by x^t gives
    (x^t A^t) (A x) = 0
    (A x)^t (A x) = ||A x||^2 = 0
    This holds only if A x = 0, i.e., x is in the null-space of A too. But because A's columns are linearly independent, only 0 is in that null-space.
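
    A quick numerical check of this argument (a sketch in NumPy; the 3x2 matrix below is arbitrary, not from the lecture):

        import numpy as np

        # An arbitrary 3x2 matrix with linearly independent columns
        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [0.0, 2.0]])

        # rank(A^t A) = rank(A), so both null-spaces are {0} here
        print(np.linalg.matrix_rank(A))          # 2
        print(np.linalg.matrix_rank(A.T @ A))    # 2

        # The quantity used in the proof: x^t (A^t A) x = ||A x||^2 for any x
        x = np.array([3.0, -1.0])
        print(np.allclose(x @ (A.T @ A) @ x, np.linalg.norm(A @ x) ** 2))  # True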

  • @Zanco
    @Zanco 13 years ago +3

    That illustration and this intuitive way of looking at the pseudoinverse are great! Good job!

  • @moumitabiswas3199
    @moumitabiswas3199 3 years ago

    Thanks a lot MIT OCW for these free lectures, and I am grateful to MIT for introducing the TEACHER Mr. Gilbert to the world. I wish him a happy and healthy life. You are a wonderful teacher. Thanks.

  • @supersnowva6717
    @supersnowva6717 a year ago

    I am not ready for this to end!

  • @davidhcefx
    @davidhcefx 5 years ago +4

    That's a really great course review! There's something new to learn at the same time.

  • @erikumble
    @erikumble 3 years ago

    Thank you Professor Strang for the great course, and MIT for making it accessible for us. This is such a great opportunity for those of us who want to learn more than what is offered at our high schools.

  • @aoiroadelacvg7489
    @aoiroadelacvg7489 5 years ago +2

    the best course to learn linear algebra

  • @npadmanabhan6980
    @npadmanabhan6980 3 years ago

    Fantastic way to introduce the pseudoinverse. GS is a great teacher. Thanks OCW for making this available.

  • @MrSyrian123
    @MrSyrian123 5 years ago +1

    Thanks MIT OCW and Prof. Gilbert for your hard work.

  • @user-gw1jx7iy1n
    @user-gw1jx7iy1n 2 years ago +1

    Thanks Professor! It really, really helps in understanding how 3D computer vision works.

  • @emenikeanigbogu9368
    @emenikeanigbogu9368 4 years ago

    It’s been a journey Professor Strang. It’s quite sad that this has come to an end, but what a beautiful way to come full circle.

  • @pankajvadhvani9278
    @pankajvadhvani9278 2 years ago

    What a stunning way in which you have given this lecture series, sir.
    You are the best teacher, Gilbert Strang sir. And many, many thanks to MIT OCW for giving this beautiful lecture series totally free, and for introducing the world to this amazing professor.

    • @vicevirtuoso123
      @vicevirtuoso123 2 years ago

      Brother, will I be able to understand it? I'm a BTech 1st year student.

    • @pankajvadhvani9278
      @pankajvadhvani9278 2 years ago

      @@vicevirtuoso123 You will, brother, but there is a deep meaning to every line of sir's, and in one watch you can't get everything; surely, though, you will get a geometrical sense of every concept, which gives you satisfaction.

    • @pankajvadhvani9278
      @pankajvadhvani9278 2 years ago

      One more suggestion: please read Gilbert Strang sir's book too, because it will tell you how much sense you got out of it, and will fill the gaps in understanding.

    • @vicevirtuoso123
      @vicevirtuoso123 2 years ago

      @@pankajvadhvani9278 Thanks brother.
      By the way, which course were you studying this for?

  • @mauriciobarda
    @mauriciobarda 4 years ago +3

    For all of you who made it to this point in this great course, I recommend watching this short series before taking the final reviews. Full of graphics with some good aha moments. czcams.com/video/fNk_zzaMoSs/video.html

  • @ummwho8279
    @ummwho8279 3 years ago

    22:03 "Where in calculus, you really, you know, you're trying to visualize these things, well 2 or 3 dimensions is kind of the *_limit_* ..."
    Professor Gilbert Strang: Come for the fantastic teaching, stay for the fantastic puns.

  • @lucasrodriguesdelima8697

    So happy with this journey!! Thanks Doctor Strange of Linear Algebra!!

  • @solfeinberg437
    @solfeinberg437 4 years ago +1

    In addition to shouting out answers I knew, I would also applaud right when he said thanks.

  • @ramonmassoni9657
    @ramonmassoni9657 3 years ago

    This class was pure art

  • @adamcai1102
    @adamcai1102 8 months ago

    Thanks Prof Strang, you helped me a lot

  • @avinavkashyap8802
    @avinavkashyap8802 2 years ago

    Literally, the legend of linear algebra.

  • @rinzindorjee31
    @rinzindorjee31 10 years ago +10

    It may not be important to go to university. I love your lectures; they give me enjoyment.

  • @Kenji314159
    @Kenji314159 11 years ago +12

    Wrong? He's awesome, he has the best explanations! You should see my teachers :o

  • @bingin15
    @bingin15 13 years ago

    I love these lectures! They are fantastic. I love the book too; I think it's really well written. Overall, I really enjoyed this course.

  • @kingsal
    @kingsal 6 months ago

    There shouldn't be a power of minus one and multiple 'T's - it's excluding matter.

    • @kingsal
      @kingsal 6 months ago

      I mean, it's no longer an inverse anymore when using this equation.

    • @kingsal
      @kingsal 6 months ago

      '34:12'

  • @mav45678
    @mav45678 5 years ago +1

    The camera crew does not understand what is important and which shots to take and later include in the final video... Way too many close-ups. In general, if the Professor is pointing to something on the blackboard and it is not included in the shot, it's a major fail.

  • @BigBen866
    @BigBen866 a year ago

    “If a matrix takes a vector to zero it’s no way an inverse can bring it back to life.” I love it😇😇😎😎

  • @kehrierg
    @kehrierg 3 years ago

    The pseudoinverse is denoted A+. Prof. Strang said pseudoinverses wouldn't be on the final exam. Therefore there were no A+'s on the final. :/ ...

  • @weiliangxu796
    @weiliangxu796 3 years ago

    I wish Professor Strang could have explained the rank decomposition for calculating the pseudo-inverse. Damn time limit.

  • @samruby82
    @samruby82 12 years ago +2

    simply amazing!

  • @ruiruihuang
    @ruiruihuang 9 years ago +5

    The best

  • @MrKrishtal
    @MrKrishtal 12 years ago +4

    Even though I'm studying in Europe, it's still interesting, useful, and easy to understand. Thanks for making these lectures available.

    • @lucasm4299
      @lucasm4299 6 years ago +1

      MrKrishtal
      MIT
      🇺🇸📈🏆

  • @sansha2687
    @sansha2687 4 years ago

    19:00, 21:10, 22:50, 31:40

  • @wowlikefun
    @wowlikefun 5 years ago +1

    Damn, this guy rocks 🎸

  • @AngeloYeo
    @AngeloYeo 5 years ago +1

    should be the legend

  • @lincyan784
    @lincyan784 3 years ago

    It is awesome! Thank you professor!

  • @geethabr374
    @geethabr374 5 years ago

    You are a great teacher!

  • @neoneo1503
    @neoneo1503 3 years ago

    One to One 24:00

  • @pelemanov
    @pelemanov 13 years ago

    @Zanco I agree, it's almost magical :-). Or as he puts it: elegant.

  • @avinavkashyap8802
    @avinavkashyap8802 2 years ago +1

    The ONE TRUE KING 😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍😍

  • @thejameskan
    @thejameskan 12 years ago

    Some great information here, thanks.

  • @yuwuxiong1165
    @yuwuxiong1165 3 years ago

    "projectors are trying to be identity, but it's impossible."

  • @simpleboy5100
    @simpleboy5100 5 years ago

    Thank you really

  • @yongjieee
    @yongjieee 10 years ago

    Prof. G. S. is awesome

  • @user-eg9mx1sz4l
    @user-eg9mx1sz4l 5 years ago

    Insightful!

  • @huangmingya2905
    @huangmingya2905 2 years ago

    Thank you!

  • @jayejayeee
    @jayejayeee 12 years ago

    very interesting video thanks

  • @georgeyu7987
    @georgeyu7987 4 years ago

    A short 40-minute lecture containing lots of juicy stuff; watch it twice.

  • @shivammalviya1718
    @shivammalviya1718 5 years ago

    Thanks sir

  • @thejameskan
    @thejameskan 12 years ago

    interesting video and very informative

  • @chengweili9516
    @chengweili9516 7 years ago

    Awesome!

  • @thegeffc
    @thegeffc 12 years ago

    very interesting thanks

  • @saurabhshrivastava224
    @saurabhshrivastava224 3 years ago +1

    21:28 "Well, I shouldn't say anything bad about calculus, but I will." 😂

  • @Qladstone
    @Qladstone 8 years ago +1

    I never really got the whole big-picture thing, even until now. Does it give any conceptual meaning beyond the definitions embodied by it? Or is it merely a memory aid to remember the definitions? The definitions embodied are:
    For any m by n matrix A - row space and nullspace are orthogonal complements in Rn, column space and left nullspace are orthogonal complements in Rm.
    i.e. does it supply meaning to the 4 subspaces the way a graph would supply meaning to a function? From what I see, no... Or am I missing something?

    • @720SouthCalifornia
      @720SouthCalifornia 8 years ago +15

      +Quanxiang Loo
      Yes, it is like a function, hence the name linear map. When we multiply by a matrix we take combinations of the columns, so the range of the mapping is the column space. If the column space doesn't span Rm, then we can see there is no way to get answers that have any components orthogonal to the column space. We call this 'orthogonal space' the left nullspace.
      Similarly, when we multiply by a matrix A we take the inner product of a vector x with the rows of A. Recall that the dot product v.w can be thought of as capturing information about how parallel v is to w and their magnitudes (||v|| ||w|| cos of the angle between them). Therefore in multiplying Ax = b, only the component of x that is parallel to the rows of A is relevant to determining b. In other words, components of x are either in the rowspace or in the nullspace. The nullspace doesn't contribute to b. (A numerical sketch follows at the end of this thread.)
      Input : Rn = Domain = (Row Space + Null Space)
      Output : Rm = (Range / Column Space) + Left Null Space
      Taking the transpose flips the input and output. This is why for orthogonal matrices, which have orthonormal columns, the inverse is the transpose. They aren't scaling or skewing the coordinate vectors, so we can just transpose to flip the input and output, and the mapping between vectors in the row and column spaces remains intact.

    • @jamjayhamz8309
      @jamjayhamz8309 8 years ago +3

      +720SouthCalifornia Thank you for this comment! Very informative!

    • @leojanssens1130
      @leojanssens1130 7 years ago +2

      Great answer... hear, hear!!
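
      The input-space splitting described above can be checked numerically; a minimal NumPy sketch (the rank-1 matrix is arbitrary, for illustration only):

          import numpy as np

          # Illustrative 2x3 matrix of rank 1: dim(row space) = 1, dim(null space) = 2
          A = np.array([[1.0, 2.0, 2.0],
                        [2.0, 4.0, 4.0]])

          x = np.array([1.0, 1.0, 1.0])

          # Split x into row-space and null-space components;
          # pinv(A) @ A is the projector onto the row space
          x_row  = np.linalg.pinv(A) @ A @ x
          x_null = x - x_row

          print(np.allclose(A @ x, A @ x_row))   # True: only the row-space part determines b
          print(np.allclose(A @ x_null, 0))      # True: the null-space part contributes nothing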

  • @imegatrone
    @imegatrone 12 years ago +1

    I really like the video Left and Right Inverses; Pseudoinverse from your

  • @mengmengcui8861
    @mengmengcui8861 8 years ago

    A good class! Thanks for all your effort (thank the teacher extra)!

  • @RohitSingh-nm9wd
    @RohitSingh-nm9wd 2 years ago

    Wow this is the end

  • @alekssandroassisbarbosa3749

    lovely 21:14

    • @Horesmi
      @Horesmi 6 years ago +1

      I shouldn't say anything bad about calculus.
      *But I will*

  • @solfeinberg437
    @solfeinberg437 4 years ago

    Does he want students to shout out answers? Because I'd be talking a lot. I'd have to ask him if he just meant us to think about it or to say it.

  • @JH-jv1ee
    @JH-jv1ee 4 years ago +1

    Is this real? He is the legend..

  • @ChessMemer69
    @ChessMemer69 2 years ago

    "Calculus sucks" - Strang, 2005 😆

  • @muhammadsarimmehdi
    @muhammadsarimmehdi 4 years ago

    How can you use the right pseudoinverse to solve Ax = b?

    • @ramkrishna3256
      @ramkrishna3256 4 years ago +1

      Ax = b
      Here we need to project b onto the column space of A, so we need the left pseudoinverse:
      (A+)Ax = (A+)b
      x' = (A+)b
      where x' is the approximate solution, and (AA+) is the projection matrix for the column space of A.
      So I guess in this case we don't use the right pseudoinverse.
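
      To round out the picture: when A has full row rank, the right pseudoinverse does solve Ax = b exactly. A minimal NumPy sketch (the 2x3 matrix is arbitrary, for illustration):

          import numpy as np

          # Full-row-rank case (m < n): infinitely many solutions exist
          A = np.array([[1.0, 0.0, 1.0],
                        [0.0, 1.0, 1.0]])
          b = np.array([2.0, 3.0])

          A_right = A.T @ np.linalg.inv(A @ A.T)   # right inverse: A @ A_right = I
          x = A_right @ b

          print(np.allclose(A @ x, b))                    # True: an exact solution
          print(np.allclose(A_right, np.linalg.pinv(A)))  # True: here it IS the pseudoinverse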

  • @AZZEDDINE2801
    @AZZEDDINE2801 11 years ago +1

    Thank you Prof, but I would like to find a video about applied mathematics, especially one focused on power systems. Thank you.

  • @oakschris
    @oakschris 8 years ago

    What does he mean "A projection matrix is the identity matrix where it can be and everywhere else it's the zero matrix" at 19:20?

    • @antoniolewis1016
      @antoniolewis1016 8 years ago +5

      A projection matrix doesn't change any vector in its column space, but it kills any vector in its left null space.
      In that sense, it is the "identity" on the column space, and the "zero" or "killer" on the left null space.

    • @davidlovell729
      @davidlovell729 7 years ago +2

      That's not quite the full explanation, because there is a third possibility. The matrix A maps a vector that is in the row space but neither the column space nor the null space onto the column space. Thus, he shouldn't have said "everywhere else" because it's not true.

    • @jagwasenibanerjee1424
      @jagwasenibanerjee1424 4 years ago

      @@davidlovell729 Ax will always be in the column space of A, right? Then can you please explain what you mean by A mapping a vector that is in the row space but not in the column space?

    • @phogbinh
      @phogbinh 3 years ago +1

      @@jagwasenibanerjee1424 Antonio was totally correct, and David was right in saying that the professor shouldn't have said "everywhere else".
      Here is my summary (hopefully this is totally correct; a numerical check follows this thread):
      b is in R^M (A is an MxN matrix).
      If b is in C(A), then the projection matrix acts on b like the identity matrix.
      If b is in N(A'), then the projection matrix acts on b like the zero matrix.
      If b is in neither C(A) nor N(A'), then the projection matrix does its job: it projects b onto C(A).
      Note: A' stands for A transpose.

    • @woddenhorse
      @woddenhorse 2 years ago

      @@phogbinh Thanks
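
      The numerical check mentioned above, sketched in NumPy (the 3x2 matrix is arbitrary, for illustration):

          import numpy as np

          # Full-column-rank A; P = A (A^T A)^{-1} A^T projects onto C(A)
          A = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0]])
          P = A @ np.linalg.inv(A.T @ A) @ A.T

          b_col  = A @ np.array([2.0, -1.0])    # b in the column space C(A)
          b_left = np.array([1.0, 1.0, -1.0])   # A.T @ b_left = 0, so b is in N(A^T)

          print(np.allclose(P @ b_col, b_col))  # True: acts as the identity on C(A)
          print(np.allclose(P @ b_left, 0))     # True: acts as the zero matrix on N(A^T)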

  • @jinzhonggu8276
    @jinzhonggu8276 2 years ago

    What does it mean when the projection matrix is trying to be the identity matrix?

    • @Robocat754
      @Robocat754 a year ago

      Only when A is invertible does that projection matrix equal the identity. Since A is invertible, it has n independent columns. What happens when you project a vector onto a subspace that the vector is actually inside? The projection matrix acts as the identity, doesn't it? It's like the least-squares problem Ax = b where b is in the column space.

    • @Robocat754
      @Robocat754 a year ago

      But elsewhere the projection matrix is the zero matrix - how?

    • @Robocat754
      @Robocat754 a year ago

      The "elsewhere" the professor refers to could be the rest of R^m, since the projection matrix projects onto the column space of A, and the column space and left null space together span R^m.
      For future learners who may have doubts about this. 😅

  • @hemantyadav1047
    @hemantyadav1047 5 years ago

    I think about this in my sleep.............lmao.

  • @shuier525
    @shuier525 3 years ago

    147K views with 700 likes; human beings are not very generous with their likes.

  • @3andhalfpac
    @3andhalfpac 12 years ago

    Haha, I love this guy.

  • @AmanSingh-xk2lv
    @AmanSingh-xk2lv 9 years ago +1

    In which lecture did he speak about full column rank?

    • @mitocw
      @mitocw 9 years ago +8

      +Aman Singh Judging from our quick search, full column rank is discussed starting in lecture 8.

  • @RrNefriana
    @RrNefriana 3 years ago

    I am a statistician watching this video :)))

  • @spandanbasu5653
    @spandanbasu5653 4 years ago

    How is A(A^TA)^(-1)A^T a projection matrix?
    Please help.

    • @saintelohim
      @saintelohim 4 years ago

      Check if the matrix satisfies P^2 = P.

    • @arijitghosh2400
      @arijitghosh2400 4 years ago

      You can always consult lecture 15. The matrix essentially projects any vector onto its column space, which gives the best possible solution to a problem that has no exact solution.
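
      Expanding saintelohim's check shows why P^2 = P holds for P = A(A^TA)^(-1)A^T: the inner A^TA cancels its inverse,
      P^2 = A(A^TA)^(-1)A^T A(A^TA)^(-1)A^T = A(A^TA)^(-1)(A^TA)(A^TA)^(-1)A^T = A(A^TA)^(-1)A^T = P.
      P is also symmetric (P^T = P), the other defining property of an orthogonal projection.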

  • @kjjonsify
    @kjjonsify 12 years ago

    What's wrong with that teacher lol