QR decomposition

  • added 17. 04. 2019
  • QR Decomposition of a matrix and applications to least squares
    Check out my Orthogonality playlist: • Orthogonal sets
    Subscribe to my channel: / @drpeyam
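
(Aside: the video's two topics, QR factorization and its application to least squares, can be checked in a few lines of NumPy. The small matrix and data below are made up purely for illustration.)

```python
import numpy as np

# Toy overdetermined system A x ≈ b (made-up data).
A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([1., 2., 2.])

Q, R = np.linalg.qr(A)           # reduced QR: A = Q R, Q has orthonormal columns
x = np.linalg.solve(R, Q.T @ b)  # least squares reduces to the triangular system R x = Q' b

# Agrees with NumPy's built-in least-squares solver.
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```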

Comments • 110

  • @riccardocapellino7680
    @riccardocapellino7680 4 years ago +92

    This guy always leaves a smile on my face

  • @quamos99
    @quamos99 3 years ago +32

    You are on my elite teachers list: Sal from Khan academy, The organic chemistry tutor, Professor Dave, and now Dr Peyam. Hats off to all of you for making my university life easier. Massive respect and love for all of you

  • @ARNAKLDO
    @ARNAKLDO 4 years ago +32

    You sir are a genius. This linear algebra extravaganza was super helpful!

    • @drpeyam
      @drpeyam  4 years ago +1

      Glad you like it 😄

  • @dani21534
    @dani21534 3 years ago +4

    3 hours of lecture and I didn't understand a word.
    5 minutes of watching this video, and I understand everything!
    Thank you.

  • @NayaM0
    @NayaM0 2 years ago +1

    Best QR decomposition video I've found. Well explained, straight to the point, easy to understand. Thank you.

  • @vassisn2001
    @vassisn2001 4 years ago +4

    Thank you for explaining a simple topic as it should be explained, in a simple way. Great explanation!

  • @OrdinaryXtreme
    @OrdinaryXtreme 4 years ago +15

    This is the funnest math tutorial video I've ever seen. You made the process seem so streamlined and easy, thank you!

  • @abnerandreymartinezzamudio3366

    I'm at the point of the semester where I need to take whatever this guy took

  • @springnuance7048
    @springnuance7048 3 years ago +2

    The radiance of his positivity in his teaching makes me love linear algebra XD

  • @peytonv.6614
    @peytonv.6614 a year ago

    I have been struggling with this topic, you explained this so well. Thank you!

  • @dengdengkenya
    @dengdengkenya 4 years ago +1

    The best explanation given on this topic!

  • @stekim
    @stekim 4 years ago

    Thanks Dr Peyam! I really liked the extension to least squares in the second half of the video.

  • @judesharp6472
    @judesharp6472 3 years ago

    Man, how did I not find you before? You're literally going to make me pass numerical analysis

  • @Ajay-xd7zq
    @Ajay-xd7zq 3 years ago +1

    Thanks so much for the explanation. It was clear, concise, and to the point.

  • @GangerPlayz
    @GangerPlayz 4 years ago +4

    This was explained so well that I understood it despite speaking Spanish and knowing barely any English

  • @epiccheeseburgercat
    @epiccheeseburgercat 4 years ago +1

    Thank you so much. This really helped my understanding of QR decomposition

  • @TheInuyashaGuy
    @TheInuyashaGuy 2 years ago +1

    This tutorial is amazing, thank you

  • @elonmusk7142
    @elonmusk7142 3 years ago

    That's awesome. I don't understand matrices and I don't understand English, but with your funny explanations, I understand everything. Thank you so much.

  • @jxw7196
    @jxw7196 3 years ago

    What a living legend... Amazing Peyam

  • @zonglehuang4147
    @zonglehuang4147 a year ago

    Thanks a lot for your clear explanation!

  • @carlosalbertodelgadoelizon4457

    Thanks a lot!! The example was very illustrative!

  • @Elisko245
    @Elisko245 3 years ago

    Thank you for an amazing explanation!

  • @joaomorais6101
    @joaomorais6101 4 years ago

    Perfect explanation. Thx professor

  • @kyliemun2681
    @kyliemun2681 3 years ago +1

    God Bless you, man!

  • @059812
    @059812 3 years ago

    Great explanation, thanks!

  • @marce3893
    @marce3893 5 years ago

    thanks for your work.

  • @Mr.HugsHoney
    @Mr.HugsHoney 2 years ago

    WOW THANK YOU DR. YOU ARE THE BEST

  • @joaofernandes8051
    @joaofernandes8051 5 years ago +38

    And I thought it was something to do with QR codes :P

  • @Bombelus
    @Bombelus 5 years ago +5

    You're my hero!

    • @drpeyam
      @drpeyam  5 years ago +4

      I’m Captain Peyamerica! 🙂

  • @user-ug2lu5rl9f
    @user-ug2lu5rl9f a year ago

    Thanks a lot!!! Preparing to hand in my homework set~~

  • @fatty3910
    @fatty3910 6 months ago +1

    omg you're so cute i can watch you teach all day

  • @abdelrhmandameen2215
    @abdelrhmandameen2215 3 years ago

    absolutely fantastic

  • @IlMarinz
    @IlMarinz 5 years ago +5

    I've always seen matrix decompositions (QR, LU) done with square matrices; it's a bit strange for me to see them in the nonsquare world. Also, I was told that Q must be unitary and Hermitian (I guess over the reals, orthogonal and symmetric would be fine), hence making QR only possible for square matrices by definition. I wonder how much of a lie resides in what I just told

  • @virgenalosveinte5915
    @virgenalosveinte5915 11 months ago

    thank you great video

  • @arwatawfiq4088
    @arwatawfiq4088 a year ago

    Very useful session thank you so much ❤

  • @Matrioshka_Brain
    @Matrioshka_Brain 2 years ago

    This was really good :)

  • @FunctionalIntegral
    @FunctionalIntegral 5 years ago +5

    And the next decomposition should be the very important singular value decomposition (SVD), which one typically uses in the matrix product state formalism. :D

  • @eduardamachado7687
    @eduardamachado7687 4 years ago

    It was very useful! Thank you so much!

  • @dariusfoodeei8181
    @dariusfoodeei8181 2 years ago

    I love this man

  • @gulrukhsor_akhmadjanova
    @gulrukhsor_akhmadjanova a month ago

    omg, thank you a lot for your priceless knowledge

  • @samytee3282
    @samytee3282 2 years ago

    Wow best explanation and nice style of teaching. Very precise and easy to understand

    • @samytee3282
      @samytee3282 2 years ago

      I wanted SVD and two grid method too :)

  • @MrRyanroberson1
    @MrRyanroberson1 5 years ago +1

    I wonder... Would a left-sided RQ decomposition ever be useful? And how easy is it to generate compared to QR?

  • @harshala5385
    @harshala5385 7 months ago

    Thank you sir.

  • @enamanjgo8930
    @enamanjgo8930 2 months ago

    u are like the bob ross of math :) thank you

  • @humanwang9854
    @humanwang9854 a year ago

    Thank you sir👍

  • @kevinshao9148
    @kevinshao9148 2 years ago

    Thanks for the video! One question though: even if R' is not invertible (meaning A has linearly dependent columns), there is still a least-squares solution, right? You just lose one dimension. No?

  • @sandorszabo2470
    @sandorszabo2470 5 years ago

    A nice lin alg video again. I hope my students will also like it 😊

  • @a.s.6898
    @a.s.6898 2 years ago

    crazy good, thank you

  • @kyc3722
    @kyc3722 4 years ago

    very very clear

  • @yapzhilin9857
    @yapzhilin9857 3 years ago

    THANK YOU SO SO MUCH

  • @RiVaLBrite
    @RiVaLBrite 4 years ago +3

    WOW this was explained really well, I wish my professor could teach like this :(

  • @ReversePrimeOFFICIAL
    @ReversePrimeOFFICIAL a year ago

    Do you have videos on SVD? Thanks for this video

  • @ankitb3954
    @ankitb3954 a year ago

    The watch steals the show

  • @rampage14x13
    @rampage14x13 4 years ago

    Thanks, this video series has really been helpful! Also, just wondering, if you would be willing to share, what watch are you wearing? I think it looks great

    • @drpeyam
      @drpeyam  4 years ago +1

      It’s an Invicta watch, you can get it on Amazon

    • @rampage14x13
      @rampage14x13 4 years ago

      @@drpeyam Thanks for letting me know!

  • @JohnDoe-sc4zf
    @JohnDoe-sc4zf 3 years ago +1

    Hey, thanks for the amazing video. I just have one quick doubt. As you said, Q is an orthonormal matrix, yet when I compute Q*Q' it does not give I. Please enlighten me — am I misunderstanding something?

    • @drpeyam
      @drpeyam  3 years ago +1

      It doesn’t have to. For nonsquare orthogonal matrices we don’t always have Q Q’ = I, that’s only true for square matrices
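
(Dr Peyam's reply is easy to verify numerically; the tall 3×2 matrix below is an arbitrary example.)

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.]])
Q, R = np.linalg.qr(A)  # reduced QR: Q is 3x2 with orthonormal columns

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q'Q = I always holds
print(np.allclose(Q @ Q.T, np.eye(3)))  # False: QQ' is a projection onto col(A), not I
```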

  • @syedmuhammadabdullah2920

    Thank you for being successful in successfully wasting my time

    • @drpeyam
      @drpeyam  15 days ago

      Awww you’re welcome!!

  • @aminfadaei4056
    @aminfadaei4056 4 years ago

    good job peyam

  • @holyshit922
    @holyshit922 3 years ago

    Gram-Schmidt is good for pencil-and-paper calculations, but I heard it is numerically unstable and should be avoided when writing a program for QR decomposition.
    Householder reflections or Givens rotations are a better choice for those who want to write a program.
    Silly, or maybe not: in Householder reflections we need the transpose of a matrix to get Q, and square matrices are easy to transpose in place.
    Transposing a rectangular matrix can also be done in place, but it is not so easy.
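
(A minimal sketch of the Householder approach this comment describes, assuming NumPy; a teaching sketch, not a library-grade routine. The 3×3 matrix at the end is just an arbitrary test case.)

```python
import numpy as np

def householder_qr(A):
    """QR decomposition via Householder reflections."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m, n)):
        x = R[k:, k]
        # Reflector v sends x to a multiple of the first basis vector;
        # the sign choice avoids cancellation (the stability point above).
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        nv = np.linalg.norm(v)
        if nv == 0:
            continue  # column already zeroed below the diagonal
        v /= nv
        # Apply H = I - 2 v v^T to R (on the left) and accumulate it into Q.
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.array([[2., -2., 18.],
              [2.,  1.,  0.],
              [1.,  2.,  0.]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A))  # True
```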

  • @kolinnhtooaung2408
    @kolinnhtooaung2408 3 years ago

    Thanks! Sir

  • @rasikajayathilaka3516
    @rasikajayathilaka3516 4 years ago

    Great!

  • @mohammadburhanmanzoor3806

    How about its importance in finding eigenvalues?

  • @sabbaramine2719
    @sabbaramine2719 3 years ago

    thx bro !!

  • @BloemkoolGaming
    @BloemkoolGaming 5 years ago +2

    Your enthusiasm is amazing; the only thing that triggers me is that you put the line of your Q on the wrong side!

    • @chiragmittal3313
      @chiragmittal3313 4 years ago +1

      Enthusiasm increases by exp(2) when watching it on 2x

  • @huongtran7922
    @huongtran7922 3 years ago

    What is the application of QR decomposition?

  • @7amood40
    @7amood40 5 years ago

    Could you do QZ decomposition?

  • @DSKim-ej5uf
    @DSKim-ej5uf 4 years ago

    Good video! It looked very good. What do you think of Jimi Hendrix? If you like my guitar and harmonica you will be happy.

  • @vijayank1170
    @vijayank1170 5 years ago

    Dr Peyam, can you please explain what LU decomposition is? I noticed it in my textbook, but have no idea what it is.

    • @drpeyam
      @drpeyam  5 years ago +2

      There’s a video on that

    • @vijayank1170
      @vijayank1170 5 years ago

      @@drpeyam Ok thanks... I'll check it out

  • @ninisky2706
    @ninisky2706 2 years ago

    3:16 what does he mean here? What is rescaling a vector? Just getting rid of the denominator?

    • @drpeyam
      @drpeyam  2 years ago

      Multiplying by a constant, here so that the components are integers

  • @RenegadeFury
    @RenegadeFury 3 years ago

    I thought u2 hat was perpendicular to v1, but apparently that's what v2 is?
    Or is u2 hat supposed to be parallel to v1?

    • @drpeyam
      @drpeyam  3 years ago

      No, u2-u2hat is perpendicular to v1

  • @phantom8673
    @phantom8673 a year ago

    In w1, where did you get 1/3? And what is w1? You said it's the length of the vector [2 2 1], which should be 3. So why is it 1/3? I don't get it

    • @drpeyam
      @drpeyam  a year ago

      You divide by the length of the vector to get a unit vector

    • @phantom8673
      @phantom8673 a year ago

      @@drpeyam I just understood that the numerator 1 is part of the formula, right? That was my question. 😄

  • @mzeeshanzafar28
    @mzeeshanzafar28 19 days ago

    Dr Ariya from Krish

  • @ademalliche878
    @ademalliche878 4 years ago +1

    Pleaseee, at 3:12, why is the first result 15/9?!

  • @deniztunayildiz4518
    @deniztunayildiz4518 6 months ago

    8:10

  • @1.4142
    @1.4142 5 months ago

    IR=V fin.

  • @3ia18_prasetyaharkrisnowo7

    Do 100 integral challenge!!!

    • @drpeyam
      @drpeyam  5 years ago

      There’s already a 100 T/F challenge

  • @counterko
    @counterko 2 years ago

    6:50 known Q find R

  • @Handelsbilanzdefizit
    @Handelsbilanzdefizit 5 years ago

    What if you define a kind of matrix product using composition of the elements: A = { {a11(x), a12(x)}, {a12(x), a22(x)} } and B = { {b11(x), b12(x)}, {b12(x), b22(x)} },
    so that A°B = { { a11(b11(x)) + a21(b12(x)), a12(b11(x)) + a22(b12(x)) }, { a11(b21(x)) + a21(b22(x)), a11(b21(x)) + a21(b22(x)) } }?
    Linear algebra is boring, because I never understood it well. Make more videos about this crazy fractional calculus stuff!
    Something like this: (d/dx)^f(x) x = f'(x), where f(x) is the order of the derivative.
    Or functions that transform other functions to their derivatives: g1(f(x)) = f'(x), g2(f(x)) = f"(x), ... where gn depends on (d/dx)^n f.
    Then you could "maybe" generalize derivatives by these functions --> g0.3(f) = (d/dx)^0.3 f

  • @Rundas69420
    @Rundas69420 5 years ago +1

    Orthogonal matrices are a bit strange anyway. I will never understand why you call a matrix with orthoNORMAL column vectors orthoGONAL and not orthonormal.
    Most of the time I love maths, but sometimes I hate it xD.

    • @sandorszabo2470
      @sandorszabo2470 5 years ago +1

      I agree with you. Luckily, there is a book whose authors say "orthonormal matrix"; in my lectures I also say this.
      And there are other cases where I use the corresponding correct name. A similar, more terrible thing 😄 is "this infinite series is convergent". Math is nice, but the language created by people is not necessarily correct.

    • @Rundas69420
      @Rundas69420 5 years ago +1

      @@sandorszabo2470 Finally someone understands me :D

    • @drpeyam
      @drpeyam  5 years ago

      I totally agree

    • @HilbertXVI
      @HilbertXVI 5 years ago

      Eh useless semantics

  • @junior-ee5dm
    @junior-ee5dm a year ago

    I love you, gringo. I understood it perfectly, I love it. You helped me study for my midterm, I'm so happy

  • @shokan7178
    @shokan7178 5 years ago +1

    First uwu!

  • @deepgems2246
    @deepgems2246 3 years ago

    Your V's look exactly like your U's

  • @typo691
    @typo691 5 years ago

    Do you put the slash in the wrong place in "Q" just to annoy viewers like me!?

    • @drpeyam
      @drpeyam  5 years ago

      No, that’s just the way I’m used to writing it

    • @typo691
      @typo691 5 years ago

      @@drpeyam fair enough

    • @Sup3rlum
      @Sup3rlum 5 years ago

      @@typo691 he is left-handed, so it's easier to swipe down and away

  • @jiongt2677
    @jiongt2677 2 years ago

    "The video's audio isn't good."

  • @esutana
    @esutana 9 months ago

    The math people call this Q a semi-orthogonal matrix; they define an orthogonal matrix to be square.
    en.wikipedia.org/wiki/Semi-orthogonal_matrix