2.5 Geometry of Least Squares Regression | Machine Learning
- Added September 8, 2024
- Thinking geometrically about least squares regression helps a lot.
- We want to minimize ||y − Xw||^2. Think of the vector y as a point in R^n. We want to find w so that the product Xw is close to y.
- If X_j is the jth column of X, then Xw = sum_{j=1}^{d+1} w_j X_j.
- That is, we weight the columns of X by the values in w to approximate y.
- The LS solution returns the w such that Xw is as close to y as possible in the Euclidean sense (i.e., the intuitive "straight-line" distance).
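The "weighted sum of columns" view of Xw can be checked directly in code. This is a minimal sketch with hypothetical toy data (the variable names and values are illustrative, not from the lecture):

```python
import numpy as np

# Hypothetical toy data: n = 5 points, one feature plus an intercept column.
rng = np.random.default_rng(0)
x = rng.normal(size=5)

# Design matrix X: a column of ones, then the feature values.
X = np.column_stack([np.ones(5), x])

# Xw is exactly the weighted sum of the columns of X.
w = np.array([2.0, 3.0])
combo = w[0] * X[:, 0] + w[1] * X[:, 1]
assert np.allclose(X @ w, combo)
```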
GEOMETRY OF LEAST SQUARES REGRESSION
arg min_w ||y − Xw||^2  ⇒  w_LS = (X^T X)^{-1} X^T y.
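The closed-form solution can be computed directly with NumPy. A sketch on hypothetical simulated data (the noise scale and dimensions are arbitrary choices); note that in practice one solves the normal equations or uses a least-squares routine rather than explicitly inverting X^T X:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, d))])  # intercept + d features
w_true = rng.normal(size=d + 1)
y = X @ w_true + rng.normal(scale=0.05, size=n)

# Closed form: w_LS = (X^T X)^{-1} X^T y, via the normal equations.
w_ls = np.linalg.solve(X.T @ X, X.T @ y)

# np.linalg.lstsq reaches the same answer through a more stable factorization.
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(w_ls, w_lstsq)
```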
The columns of X span a (d + 1)-dimensional subspace of the higher-dimensional R^n. The closest point to y in that subspace is the orthogonal projection of y onto the column space of X.
(Figure, right: y ∈ R^3 and data x_i ∈ R, with columns X_1 = [1, 1, 1]^T and X_2 = [x_1, x_2, x_3]^T.)
The approximation is ŷ = X w_LS = X (X^T X)^{-1} X^T y.
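The matrix H = X (X^T X)^{-1} X^T that maps y to ŷ is an orthogonal projection, so it is idempotent and leaves a residual orthogonal to every column of X. A minimal sketch verifying both properties on hypothetical random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.normal(size=n)

# H projects y onto the column space of X: y_hat = H y.
H = X @ np.linalg.solve(X.T @ X, X.T)
y_hat = H @ y

# A projection is idempotent: projecting twice changes nothing.
assert np.allclose(H @ H, H)

# The residual y - y_hat is orthogonal to every column of X.
assert np.allclose(X.T @ (y - y_hat), 0)
```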
(a) y_i ≈ w_0 + x_i^T w for i = 1, …, n    (b) y ≈ Xw
There are some key differences between (a) and (b) worth highlighting as you try to develop the corresponding intuitions.
(a) can be drawn for all n, but only for x_i ∈ R^2 (not counting the added 1).
(b) corresponds to n = 3 and one-dimensional data: X = [1, x_1; 1, x_2; 1, x_3] (a 3 × 2 matrix).
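The n = 3, one-feature case in (b) is small enough to work through numerically. A sketch with hypothetical values for x_1, x_2, x_3 and y: ŷ lands in the 2-dimensional plane spanned by [1, 1, 1]^T and [x_1, x_2, x_3]^T inside R^3, as close to y as that plane allows.

```python
import numpy as np

# Hypothetical inputs for the slide's n = 3, one-feature example.
x = np.array([0.5, 1.0, 2.0])
X = np.column_stack([np.ones(3), x])   # 3x2 design matrix [1, x_i]
y = np.array([1.0, 2.0, 3.5])

w_ls = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ w_ls

# y_hat lies in the plane spanned by X's two columns, and the residual
# is orthogonal to that plane.
assert np.allclose(X.T @ (y - y_hat), 0)
```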
#leastsquare #geometry #linearregression #linear #regression