X'X inverse X' and SVD
In the process of knowledge gentrification, I continue to meander my way back and forth through issues of linear algebra...
This is post #6 in a series discussing the expression (X'X)^-1X'Y used in linear regression. Previous posts are here: #1, #2, #3, #4, #5.
Most of the literature I've read on least squares discusses the orthogonal projection of Y onto the column space of X, but something I haven't seen mentioned much is that (X'X)^-1X' is the Moore-Penrose pseudoinverse of X:
X^+ = (X'X)^-1X'
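As a quick sanity check, here's a minimal NumPy sketch comparing the normal-equations form against np.linalg.pinv. The 10x3 random matrix is just an arbitrary full-column-rank example, not anything from the original posts:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tall random matrix (more rows than columns); with probability 1
# it has full column rank, so (X'X) is invertible.
X = rng.standard_normal((10, 3))

# (X'X)^-1 X', computed directly from the normal-equations form.
normal_eq_pinv = np.linalg.inv(X.T @ X) @ X.T

# NumPy's Moore-Penrose pseudoinverse (computed internally via the SVD).
print(np.allclose(normal_eq_pinv, np.linalg.pinv(X)))  # True
```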
Now's a good time to bring the SVD back into the picture. (Actually, now that I've made it this far, I think I need to go back and amend an old post a little bit.)
Given the (economy) SVD of X as USV', consider (X'X):
(X'X) = (VS'U')(USV') = VS'U'USV' = VS'SV'
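A short NumPy check of that factorization, using the economy SVD so that S is square diagonal (again, the 10x3 random matrix is just an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))

# Economy SVD: U is 10x3 with orthonormal columns, s holds the
# singular values (the diagonal of S), and Vt is V'.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# X'X = V(S'S)V' = V diag(s^2) V'
print(np.allclose(X.T @ X, Vt.T @ np.diag(s**2) @ Vt))  # True
```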
And:
(X'X)^-1X' = (VS'SV')^-1(VS'U') =
(V(S'S)^-1V')(VS'U') =
V(S'S)^-1V'VS'U' =
V(S'S)^-1S'U'
And since (S'S)^-1S' = S^+:
VS^+U' = (X'X)^-1X'
This shows the equivalence between the SVD pseudoinverse and (X'X)^-1X', and why the SVD pseudoinverse gives the least squares solution. The identity holds when X has full column rank (rank equal to its number of columns, which requires at least as many rows as columns), since that's exactly the condition that makes (X'X) invertible in the first place.
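And a sketch putting it all together: build VS^+U' from the economy SVD, confirm it matches (X'X)^-1X', and check that applying it to y reproduces the least squares solution from np.linalg.lstsq. The shapes and random data are, as before, just assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))  # tall, full column rank
y = rng.standard_normal(10)

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# For a square diagonal S, S^+ just inverts the diagonal entries.
X_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

# VS^+U' is the same matrix as (X'X)^-1 X'...
print(np.allclose(X_pinv, np.linalg.inv(X.T @ X) @ X.T))  # True

# ...and applying it to y gives the least squares coefficients.
beta = X_pinv @ y
print(np.allclose(beta, np.linalg.lstsq(X, y, rcond=None)[0]))  # True
```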
(Note: V is orthogonal, so its transpose and inverse are one and the same; in the economy SVD used here, U has orthonormal columns, so U'U = I. S is a square diagonal matrix, so it's equal to its own transpose. The transpose of a product is the reverse product of the individual transposes; likewise, the inverse of a product is the reverse product of the individual inverses.)