Linear Algebra for Graphics Geeks (SVD-X)
This was previously posted. After realizing it was redundant in light of an earlier post, I retracted it. On third thought, I'm reposting as it has additional thoughts on the subject of the SVD and eigen decomposition.
If a matrix A is symmetric, then
A = A^T
Expressed in terms of the SVD, A = USV^T:
USV^T = (USV^T)^T
USV^T = VSU^T
In the case of a symmetric matrix
AA^T = A^TA = A^2
Since these are equal, and U contains the eigenvectors of AA^T while V contains the eigenvectors of A^TA, U and V both contain eigenvectors of A^2. They are genuinely equal when A is positive semidefinite; for a symmetric A with negative eigenvalues, the corresponding columns of U and V differ by a sign.
Let's state that again.
In the case of the SVD of a symmetric positive semidefinite matrix A:
U = V = eigenvectors of A^2.
S = square roots of the eigenvalues of A^2.
And since U and V are equal...
USU^T = VSV^T
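Here's a quick way to sanity-check this numerically. Below is a minimal sketch using NumPy and a random positive semidefinite test matrix (both choices are mine, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T                        # random symmetric positive semidefinite matrix

U, S, Vt = np.linalg.svd(A)
print(np.allclose(U, Vt.T))        # True: U and V agree

evals = np.linalg.eigvalsh(A @ A)  # eigenvalues of A^2, in ascending order
print(np.allclose(np.sort(S**2), evals))  # True: S^2 = eigenvalues of A^2
```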
Given that realization, the next question crossing my mind was: what happens if you calculate the SVD of A^TA?
Considering the previous conclusions, and A^TA being a symmetric positive semidefinite matrix, it follows that:
U = V = eigenvectors of (A^TA)^2
S = square roots of the eigenvalues of (A^TA)^2, which are simply the eigenvalues of A^TA
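The same kind of check works here (again a NumPy sketch with a random test matrix, my assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
AtA = A.T @ A                      # symmetric positive semidefinite

U, S, Vt = np.linalg.svd(AtA)
print(np.allclose(U, Vt.T))        # True: U and V agree

evals = np.linalg.eigvalsh(AtA)    # eigenvalues of A^T A, ascending
print(np.allclose(np.sort(S), evals))  # True: S = eigenvalues of A^T A
```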
This led me to consider the relationship between the eigenvectors of A^TA and the eigenvectors of (A^TA)^2.
Consider the eigen decomposition of A^TA:
A^TA = USU^T
Square both sides...
(A^TA)^2 = (USU^T)(USU^T) = US^2U^T (the middle U^TU cancels to the identity)
Bottom line: squaring the matrix squares the eigenvalues, and the eigenvectors remain the same.
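In code, the claim looks like this (a NumPy sketch under the same random-matrix assumption):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
AtA = A.T @ A

evals, U = np.linalg.eigh(AtA)          # A^T A = U diag(evals) U^T
squared = U @ np.diag(evals**2) @ U.T   # same eigenvectors, squared eigenvalues
print(np.allclose(squared, AtA @ AtA))  # True
```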
Pulling everything together to a conclusion, the SVD of a symmetric positive semidefinite matrix A results in the following.
U = V = eigenvectors of A.
S = eigenvalues of A (for a general symmetric A, S holds their absolute values).
That is, given a symmetric matrix A...
A = USV^T = USU^T = VSV^T
A^2 = US^2V^T = US^2U^T = VS^2V^T
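The whole chain of identities survives a numerical test; a hedged NumPy sketch, with the positive semidefinite case assumed as above:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = B @ B.T                        # symmetric positive semidefinite

U, S, Vt = np.linalg.svd(A)
Smat = np.diag(S)
print(np.allclose(A, U @ Smat @ U.T))         # A = U S U^T
print(np.allclose(A, Vt.T @ Smat @ Vt))       # A = V S V^T
print(np.allclose(A @ A, U @ Smat**2 @ U.T))  # A^2 = U S^2 U^T
```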
Is there any practical value in this?
Calculating the SVD of a matrix with a great number of rows or a great number of columns can be pretty expensive in terms of memory consumption (although there are more efficient, abbreviated methods highlighted in the Wikipedia page). On the other hand, the dimensions of A^TA might be insignificant by comparison: if A is n×m with n much larger than m, A^TA is only m×m.
If the objective is, say, calculating the eigenvectors of A^TA, is it acceptable to take the SVD of A^TA instead of the SVD of A? Some of the literature I've read says a benefit of the SVD is that it's more numerically stable than explicitly forming A^TA, but all of the tests I've run in Scilab for the cases interesting me seem to work very well. I don't know the answer to this question. Maybe I can find it.
Where's the Shoeshine Boy from Police Squad! when you need him?
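In the meantime, one way to poke at the question (a sketch, not an answer): compare the right singular vectors of a tall random matrix computed directly from the SVD of A against those computed after explicitly forming A^TA. Forming the product squares the condition number, which is where the stability warnings come from:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((1000, 5))           # tall-and-skinny test matrix

_, _, Vt_direct = np.linalg.svd(A, full_matrices=False)  # SVD of A itself
_, _, Vt_formed = np.linalg.svd(A.T @ A)                 # SVD of the 5x5 product

# Both are sorted by descending singular value; compare up to column signs.
diff = np.abs(np.abs(Vt_direct) - np.abs(Vt_formed)).max()
print(diff)  # tiny for a well-conditioned A; grows as A becomes ill-conditioned
```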
I have a final thought that fits with the ideas in this post.
Given a diagonal matrix W with nonnegative entries (so W^{1/2} is real, and, being diagonal, symmetric) and any old matrix A, consider the SVD of W^{1/2}A.
In this case, V contains the eigenvectors of
(W^{1/2}A)^T(W^{1/2}A) =
A^TW^{1/2}W^{1/2}A =
A^TWA
And U contains the eigenvectors of
(W^{1/2}A)(W^{1/2}A)^T =
W^{1/2}AA^TW^{1/2}
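A NumPy sketch of this claim (the positive weights are my assumption, chosen so W^{1/2} stays real):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 3))
w = rng.uniform(0.5, 2.0, size=6)     # positive diagonal of W
Wh = np.diag(np.sqrt(w))              # W^{1/2}

_, _, Vt = np.linalg.svd(Wh @ A, full_matrices=False)
evals, Q = np.linalg.eigh(A.T @ np.diag(w) @ A)   # eigenvectors of A^T W A

# eigh sorts ascending, the SVD descending; compare up to order and sign.
print(np.allclose(np.abs(Vt[::-1].T), np.abs(Q)))  # True
```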
If we swap the order and calculate the SVD of AW^{1/2}, then V contains the eigenvectors of
(AW^{1/2})^T(AW^{1/2}) =
W^{1/2}A^TAW^{1/2}
And U contains the eigenvectors of
(AW^{1/2})(AW^{1/2})^T =
(AW^{1/2})(W^{1/2}A^T) =
AW^{1/2}W^{1/2}A^T =
AWA^T
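And the mirror-image check for U (same assumptions as the previous sketch):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((3, 6))
w = rng.uniform(0.5, 2.0, size=6)
Wh = np.diag(np.sqrt(w))              # W^{1/2}

U, _, _ = np.linalg.svd(A @ Wh, full_matrices=False)
evals, Q = np.linalg.eigh(A @ np.diag(w) @ A.T)   # eigenvectors of A W A^T

# Reverse U's columns to match eigh's ascending order; compare up to sign.
print(np.allclose(np.abs(U[:, ::-1]), np.abs(Q)))  # True
```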
This is worthy of further consideration in a future post.
This is post #14 of Linear Algebra for Graphics Geeks. Here are links to posts #1, #2, #3, #4, #5, #6, #7, #8, #9, #10, #11, #12, #13.