Linear Algebra - Lecture 32 - Dimension, Rank, and Nullity

  • Published 13 Jul 2024
  • In this video, I define the dimension of a subspace. I also prove the fact that any two bases of a subspace must have the same number of vectors, which guarantees that dimension is well-defined. Finally, I define the rank and nullity of a matrix, and explain the Rank-Nullity Theorem.
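The Rank-Nullity Theorem described above can be checked numerically. A minimal sketch using SymPy, with an arbitrary example matrix (not one from the lecture):

```python
import sympy as sp

# Arbitrary 3x4 example matrix (row 3 = row 1 + row 2, so the rank is 2).
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 2],
               [1, 3, 1, 3]])

rank = A.rank()                 # dimension of col(A)
nullity = len(A.nullspace())    # dimension of nul(A) = number of basis vectors

# Rank-Nullity Theorem: rank(A) + nullity(A) = number of columns of A.
assert rank + nullity == A.cols
print(rank, nullity)  # 2 2
```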

Comments • 14

  • @b.cmagwaza6365 • 2 years ago +3

    You are the best teacher in the world

  • @johnlocke211 • 3 years ago +14

    I would love to have this IN PDF

  • @mikesgarage18 • 2 years ago +3

    Great job explaining this stuff.

  • @mphatsomavutokawonga8652 • 1 month ago

    Great stuff

  • @rahulrathod6770 • 3 years ago +2

    Excellent 👍

  • @gulshanmustafayeva1707 • 2 years ago +1

    GREAT! Thank You!

  • @kensienc.handugan8688 • 3 years ago +1

    Thanks 😘 love U from Philippines

  • @gradientO • 3 years ago +4

    Create a playlist for this series

  • @galben100 • 1 year ago +1

    Wouldn't the basis for col(A) be the columns associated with the pivot columns? Not the rows? So it would be [-1 2 3 4], [-2 4 1 2]. Idk man my final is in an hour ._.
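The commenter's instinct matches the standard result: a basis for col(A) is formed by the columns of the *original* matrix A that sit in the pivot positions of the reduced form. A hedged SymPy sketch with an arbitrary matrix (not the one from the video; here column 1 is twice column 0, so only columns 0 and 2 are pivot columns):

```python
import sympy as sp

# Arbitrary example matrix; its second column is 2x the first.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 7],
               [1, 2, 4]])

# rref() returns the reduced row echelon form and the pivot column indices.
_, pivot_cols = A.rref()

# Basis for col(A): take the pivot columns from the ORIGINAL A, not from the rref.
basis = [A.col(j) for j in pivot_cols]
print(pivot_cols)  # (0, 2)
```

Note the basis vectors come from A itself; the rref only tells you *which* columns to keep, since row reduction changes the column space.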

  • @aditya3984 • 1 year ago +1

    Why isn't there a video on change of basis?

  • @fuadhassan9592 • 4 months ago

    I think the row reduction you did is wrong, or let's just say I don't understand why yours is that way.

  • @weisanpang7173 • 1 year ago

    Hi professor James, all the previous videos leading up to this one showed a linear dependence relation using the equation x1v1 + x2v2 + ... + xivi = 0, where each vi is a vector in the subspace. However, in this video you show the linear dependence relation using the "coordinates" of the vectors in the subspace, not the vectors themselves. I don't remember seeing a proof that linear dependence can be established using the coordinates instead of the vectors, but here's my reasoning trying to understand it; please let me know if it's correct.
    Starting from the vector equation x1v1 + x2v2 + ... + xivi = 0, each vi can be written as Bci, where B is the matrix whose columns are the basis vectors and c1, ..., ci are the coordinate vectors:
    x1Bc1 + x2Bc2 + ... + xiBci = 0
    => B(x1c1 + x2c2 + ... + xici) = 0
    => Since the columns of B are a basis, hence linearly independent, (x1c1 + x2c2 + ... + xici) must be 0.

    • @HamblinMath • 1 year ago

      I think that what's confusing you is that we're showing that {v_1, v_2, ..., v_q} is linearly dependent in an indirect way, rather than using the definition of linear dependence. We instead show that [v_1]_B, [v_2]_B, ... [v_q]_B are linearly dependent (by reasoning about numbers of rows and columns). Then, we rely on the correspondence between x and [x]_B (established in Lecture 31) to say that the same dependence relation that exists among [v_1]_B, [v_2]_B, ... [v_q]_B must also exist among v_1, v_2, ..., v_q.
      There's a linear transformation T lurking behind the scenes that represents the correspondence between x and [x]_B. What we're really doing is applying T^-1 to the dependence relation among [v_1]_B, [v_2]_B, ... [v_q]_B to get the dependence relation among v_1, v_2, ..., v_q.
      I hope this helps!
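The correspondence described in this reply can be checked numerically. A minimal sketch (my own example, not from the lecture): B stores a basis of a 2-dimensional subspace of R^3 as its columns, any three coordinate vectors in R^2 are necessarily dependent, and the same dependence coefficients carry over to the vectors themselves.

```python
import sympy as sp

# Basis of a 2-dimensional subspace of R^3, stored as the columns of B.
B = sp.Matrix([[1, 0],
               [0, 1],
               [1, 1]])

# Coordinate vectors [v_i]_B of three vectors in the subspace.
c1 = sp.Matrix([1, 0])
c2 = sp.Matrix([0, 1])
c3 = sp.Matrix([2, 3])

# Three vectors in R^2 must be dependent; here c3 = 2*c1 + 3*c2.
assert c3 == 2*c1 + 3*c2

# The actual vectors v_i = B*[v_i]_B satisfy the SAME dependence relation.
v1, v2, v3 = B*c1, B*c2, B*c3
assert v3 == 2*v1 + 3*v2
```

This is the direction of the argument in the video: find a dependence relation among the (shorter) coordinate vectors, then transfer it back to the original vectors through the coordinate correspondence.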

    • @weisanpang7173 • 1 year ago

      @@HamblinMath Thank you professor James, I fully got it.