### Linear Algebra for Graphics Geeks (SVD-X)

This was previously posted. After realizing it was redundant in light of an earlier post, I retracted it. On third thought, I'm reposting as it has additional thoughts on the subject of the SVD and eigen decomposition.


*If a matrix is symmetric, then* A = A^{T}.

*Expressed in terms of the SVD:*

USV^{T} = (USV^{T})^{T}

USV^{T} = VSU^{T}

*In the case of a symmetric matrix:*

AA^{T} = A^{T}A = A^{2}

Since these are equal, and U contains the eigenvectors of AA^{T} while V contains the eigenvectors of A^{T}A, U and V must be equal (up to the signs of the columns) and both contain the eigenvectors of A^{2}.

Let's state that again.

In the case of the SVD of symmetric matrix A:

U = V = eigenvectors of A^{2}.

S = square roots of the eigenvalues of A^{2}.

*And since U and V are equal...*

USU^{T} = VSV^{T}

Given that realization, the next question crossing my mind was what happens if you calculate the SVD of A^{T}A?

Considering the previous conclusions and A^{T}A being a symmetric matrix, it follows that:

U = V = eigenvectors of (A^{T}A)^{2}

S = square roots of the eigenvalues of (A^{T}A)^{2} (which are just the eigenvalues of A^{T}A)

This led me to consider the relationship between the eigenvectors of A^{T}A and the eigenvectors of (A^{T}A)^{2}.

*Consider the eigen decomposition of A^{T}A:*

A^{T}A = USU^{T}

*Square both sides...*

(A^{T}A)^{2} = (USU^{T})(USU^{T}) = US^{2}U^{T}

Bottom line: squaring the matrix amounts to a squaring of the eigenvalues, and the eigenvectors remain the same.
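These claims are easy to poke at numerically. Here is a minimal sketch in Python/NumPy (rather than the Scilab used elsewhere in this series) checking the singular-value relationship, the agreement between U and V, and the squaring lemma; the random test matrix is my own, not from the original experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrix: A = B + B^T.
B = rng.standard_normal((4, 4))
A = B + B.T

# SVD of A, eigen decomposition of A^2.
U, s, Vt = np.linalg.svd(A)
evals_sq, evecs_sq = np.linalg.eigh(A @ A)

# The singular values of A are the square roots of the eigenvalues
# of A^2 (eigh sorts ascending, svd descending, hence the sorts).
assert np.allclose(np.sort(s), np.sqrt(np.sort(evals_sq)))

# U and V match column for column, up to sign.
signs = np.sign(np.sum(U * Vt.T, axis=0))
assert np.allclose(U * signs, Vt.T, atol=1e-8)

# Squaring lemma: the eigenvectors of A^2 also diagonalize
# (A^2)^2 = A^4, and the diagonal holds the squared eigenvalues.
D = evecs_sq.T @ (A @ A @ A @ A) @ evecs_sq
assert np.allclose(D, np.diag(np.diag(D)), atol=1e-8)
assert np.allclose(np.sort(np.diag(D)), np.sort(evals_sq**2))
```

The sign bookkeeping is the only subtlety: a negative eigenvalue of A shows up as a sign flip between the corresponding columns of U and V.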

Pulling everything together to a conclusion, the SVD of symmetric matrix A results in the following.

U = V = eigenvectors of A (up to the signs of the columns).

S = absolute values of the eigenvalues of A.

(For a positive semidefinite A there are no sign flips: U and V are identical and S holds the eigenvalues themselves.)

*That is, given a symmetric positive semidefinite matrix A...*

A = USV^{T} = USU^{T} = VSV^{T}

A^{2} = US^{2}V^{T} = US^{2}U^{T} = VS^{2}V^{T}

Is there any practical value in this?

Calculating the SVD of a matrix with a great number of rows or a great number of columns can be pretty expensive in terms of memory consumption (although there are more efficient, abbreviated methods highlighted in the Wikipedia page). On the other hand, the dimensionality of A^{T}A might be insignificant by comparison.

If the objective is one of, say, calculating eigenvectors of A^{T}A, is it acceptable to do the SVD of A^{T}A instead? Some of the literature I've read says a benefit of SVD is that it's more numerically stable than calculating A^{T}A, but all of the tests I've run in Scilab for the cases interesting me seem to work very well. I don't know the answer to this question. Maybe I can find it.

*Where's the Shoeshine Boy from Police Squad! when you need him?*

I have a final thought that fits with the ideas in this post.
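First, though, a NumPy version of the kind of experiment described above, computing the eigenvectors of A^{T}A three ways: eigen decomposition of A^{T}A, SVD of A^{T}A, and SVD of A itself. This is a toy-sized sketch of my own; it says nothing about conditioning at scale:

```python
import numpy as np

rng = np.random.default_rng(1)

# A tall, skinny matrix: A^T A is only 3x3 despite the 500 rows.
A = rng.standard_normal((500, 3))
AtA = A.T @ A

# Route 1: eigen decomposition of A^T A (eigh sorts ascending).
evals, evecs = np.linalg.eigh(AtA)

# Route 2: SVD of the small symmetric matrix A^T A.
U1, s1, _ = np.linalg.svd(AtA)

# Route 3: SVD of A itself -- its right singular vectors are the
# eigenvectors of A^T A, its singular values their square roots.
_, s2, Vt2 = np.linalg.svd(A, full_matrices=False)

# All three agree on the spectrum of A^T A...
assert np.allclose(np.sort(s1), np.sort(evals))
assert np.allclose(np.sort(s2**2), np.sort(evals))

# ...and on the eigenvectors. Absolute inner products sidestep sign
# flips; eigh's ascending vs svd's descending order puts the matches
# on the anti-diagonal.
assert np.allclose(np.abs(evecs.T @ U1), np.eye(3)[:, ::-1], atol=1e-6)
assert np.allclose(np.abs(evecs.T @ Vt2.T), np.eye(3)[:, ::-1], atol=1e-6)
```

On this well-conditioned example all three routes agree to working precision, which matches my Scilab experience, but it doesn't settle the stability question for ill-conditioned A.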

Given a diagonal matrix W and any old matrix A, consider the SVD of W^{1/2}A.

In this case, V contains the eigenvectors of

(W^{1/2}A)^{T}(W^{1/2}A) = A^{T}W^{1/2}W^{1/2}A = A^{T}WA

And U contains the eigenvectors of

(W^{1/2}A)(W^{1/2}A)^{T} = W^{1/2}AA^{T}W^{1/2}

*If we swap the order and calculate the SVD of AW^{1/2}...*

Then V contains the eigenvectors of

(AW^{1/2})^{T}(AW^{1/2}) = W^{1/2}A^{T}AW^{1/2}

And U contains the eigenvectors of

(AW^{1/2})(AW^{1/2})^{T} = (AW^{1/2})(W^{1/2}A^{T}) = AW^{1/2}W^{1/2}A^{T} = AWA^{T}

*This is worthy of further consideration in a future post.*

This is post #14 of Linear Algebra for Graphics Geeks. Here are links to posts #1, #2, #3, #4, #5, #6, #7, #8, #9, #10, #11, #12, #13.
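A postscript: those weighted-SVD claims are also easy to sanity-check. A minimal NumPy sketch of my own (toy sizes; the weights are assumed positive so that W^{1/2} is real):

```python
import numpy as np

rng = np.random.default_rng(2)

A = rng.standard_normal((4, 4))
w = rng.uniform(1.0, 2.0, size=4)       # positive diagonal weights
W = np.diag(w)
W_half = np.diag(np.sqrt(w))            # W^{1/2} is easy: W is diagonal

# SVD of W^{1/2} A: V should hold the eigenvectors of A^T W A,
# and the squared singular values its eigenvalues.
_, s, Vt = np.linalg.svd(W_half @ A)
evals, evecs = np.linalg.eigh(A.T @ W @ A)
assert np.allclose(np.sort(s**2), np.sort(evals))
# Absolute inner products sidestep sign flips; ascending (eigh)
# vs descending (svd) order puts matches on the anti-diagonal.
assert np.allclose(np.abs(evecs.T @ Vt.T), np.eye(4)[:, ::-1], atol=1e-6)

# Swap the order: for A W^{1/2}, U should hold the eigenvectors
# of A W A^T instead.
U2, s2, _ = np.linalg.svd(A @ W_half)
evals2, evecs2 = np.linalg.eigh(A @ W @ A.T)
assert np.allclose(np.sort(s2**2), np.sort(evals2))
assert np.allclose(np.abs(evecs2.T @ U2), np.eye(4)[:, ::-1], atol=1e-6)
```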
