Linear Algebra - Lecture 32 - Dimension, Rank, and Nullity
- Uploaded 13. 07. 2024
- In this video, I define the dimension of a subspace. I also prove the fact that any two bases of a subspace must have the same number of vectors, which guarantees that dimension is well-defined. Finally, I define the rank and nullity of a matrix, and explain the Rank-Nullity Theorem.
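A quick numeric sketch of the Rank-Nullity Theorem described above (the matrix here is made up for illustration, not taken from the video): rank(A) + nullity(A) always equals the number of columns of A.

```python
# Illustrative sketch of the Rank-Nullity Theorem, using a hypothetical matrix.
import sympy as sp

# A 3x4 matrix whose third row is the sum of the first two.
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 2],
               [1, 3, 1, 3]])

rank = A.rank()               # dimension of col(A)
nullity = len(A.nullspace())  # dimension of nul(A)

print(rank, nullity, A.cols)  # 2 2 4 -- rank + nullity = number of columns
```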
You are the best teacher in the world
I would love to have this IN PDF
Great job explaining this stuff.
Great stuff
Excellent 👍
GREAT! Thank You!
Thanks 😘 love U from Philippines
Create a playlist for this series
Wouldn't the basis for col(A) be the columns of A associated with the pivot positions? Not the rows? So it would be [-1 2 3 4], [-2 4 1 2]. Idk man, my final is in an hour ._.
Why isn't there a video on change of basis?
I think the row reduction you did is wrong, or let's just say I don't understand why yours is that way.
Hi Professor James, all previous videos leading up to this one showed the linear dependence relation using the equation x1v1 + x2v2 + ... + xivi = 0, where each vi is a vector in the subspace. However, in this video you show the linear dependence relation using the "coordinates" of the vectors in the subspace, not the vectors themselves. I don't remember seeing any proof that linear dependence can be established using the coordinates instead of the vectors, but here's my reasoning trying to understand it; please let me know if it's correct.
Starting from the vector equation x1v1 + x2v2 + ... + xivi = 0, each vi can be rewritten as BCi, giving:
x1BC1 + x2BC2 + ... + xiBCi = 0, where B is the matrix whose columns are the basis vectors and C1, ..., Ci are the coordinate vectors.
=> B(x1C1 + x2C2 + ... + xiCi) = 0
=> Since the columns of B are linearly independent (they form a basis), B times a vector equals 0 only when that vector is 0, so (x1C1 + x2C2 + ... + xiCi) = 0.
I think that what's confusing you is that we're showing that {v_1, v_2, ..., v_q} is linearly dependent in an indirect way, rather than using the definition of linear dependence. We instead show that [v_1]_B, [v_2]_B, ... [v_q]_B are linearly dependent (by reasoning about numbers of rows and columns). Then, we rely on the correspondence between x and [x]_B (established in Lecture 31) to say that the same dependence relation that exists among [v_1]_B, [v_2]_B, ... [v_q]_B must also exist among v_1, v_2, ..., v_q.
There's a linear transformation T lurking behind the scenes that represents the correspondence between x and [x]_B. What we're really doing is applying T^-1 to the dependence relation among [v_1]_B, [v_2]_B, ... [v_q]_B to get the dependence relation among v_1, v_2, ..., v_q.
I hope this helps!
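The correspondence in the reply above can be checked numerically. In this sketch (with a made-up basis and coordinates, not the example from the lecture), a dependence relation among the coordinate vectors [v_i]_B carries over to the vectors v_i themselves, because v_i = B [v_i]_B and matrix multiplication is linear:

```python
# Sketch: a dependence relation among coordinate vectors is the same
# relation among the vectors they represent.
import numpy as np

# Columns of B form a (hypothetical) basis of a plane in R^3.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

c1 = np.array([1.0, 2.0])   # coordinate vectors relative to B
c2 = np.array([3.0, 1.0])
c3 = 2*c1 - c2              # a dependence relation among the coordinates

v1, v2, v3 = B @ c1, B @ c2, B @ c3  # the actual vectors in the subspace

print(np.allclose(v3, 2*v1 - v2))   # True -- same relation among the vectors
```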
@HamblinMath Thank you, Professor James, I fully got it.