This time I'm going to discuss eigenvectors, eigenvalues and ways in which they are related to the SVD.
For all the complaints levied against Wikipedia, I am often pleased by how clearly and succinctly many subjects are explained:
"An eigenvector of a transformation[1] is a non-null vector whose direction is unchanged by that transformation. The factor by which the magnitude is scaled is called the eigenvalue of that vector"
Let's say you have a matrix A and a vector V and you transform V with A. If A has no effect on the direction of V and all that A does is make V longer or shorter (or the same), then V is an eigenvector of A:
A*V = scale*V
The value of "scale" is the eigenvalue for the eigenvector V. Eigenvalues are usually identified with the symbol lamba.
Take our matrix that scales X by 3 and Y by 2:
[3 0]
[0 2]
The eigenvectors of this matrix are [1 0]' and [0 1]'. The eigenvalues are 3 and 2.
[3 0] * [1] = [3] = 3 * [1]
[0 2]   [0]   [0]       [0]

[3 0] * [0] = [0] = 2 * [0]
[0 2]   [1]   [2]       [1]
Transformation of the unit vector on the x axis [1 0]' results in [3 0]', a 3X scaling of its length. Likewise, transformation of the unit vector on the y axis [0 1]' results in [0 2]', a 2X scaling of its length.
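If you want to confirm this numerically, here's a minimal sketch using NumPy (the snippet is mine, not part of the argument above):

    import numpy as np

    A = np.array([[3.0, 0.0],
                  [0.0, 2.0]])

    # eig returns the eigenvalues and a matrix whose columns are the eigenvectors
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)    # [3. 2.]
    print(eigenvectors)   # columns are [1 0]' and [0 1]'

    # Verify A*V = lambda*V for the first eigenpair
    v = eigenvectors[:, 0]
    print(np.allclose(A @ v, eigenvalues[0] * v))   # True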
The eigen decomposition takes the following form:
A = P*D*P^-1
Where P is an invertible matrix whose columns are eigenvectors of A, and D is a diagonal matrix with the corresponding eigenvalues on its diagonal.
This decomposition can always be done on a symmetric matrix, and in that case P can be chosen to be orthogonal (Spectral Theorem). Since the product of a matrix and its transpose is always a symmetric matrix, this decomposition can always be done on A*A' and A'*A.
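Here's a quick sketch of the Spectral Theorem at work, using eigh (NumPy's eigensolver for symmetric matrices) on a product of the form A*A'; the matrix A itself is just an arbitrary example:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [0.0, 2.0]])
    M = A @ A.T   # symmetric by construction

    # eigh returns an orthogonal matrix of eigenvectors for a symmetric input
    eigenvalues, P = np.linalg.eigh(M)
    D = np.diag(eigenvalues)

    print(np.allclose(M, P @ D @ P.T))        # True: M = P*D*P'
    print(np.allclose(P.T @ P, np.eye(2)))    # True: P is orthogonal, so P^-1 = P'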
Note to self: In a revision, I'd like to say something about the relationship between the eigen decomposition and the ellipse equation. One form of the equation for the ellipse is ax^2 + by^2 = 1.

While we're on the subject, it's interesting to consider the square of a symmetric matrix given its eigen decomposition.
A^2 = A*A = (P*D*P^-1)*(P*D*P^-1)
= P*D*P^-1*P*D*P^-1
= P*D*(P^-1*P)*D*P^-1
= P*D*I*D*P^-1
= P*D*D*P^-1
= P*D^2*P^-1
Note that squaring the matrix results in a squaring of D, and since D is a diagonal matrix, this amounts to a squaring of each element on D's diagonal.
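A quick numeric check of that claim, again just a sketch with an arbitrary symmetric matrix:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])   # symmetric

    eigenvalues, P = np.linalg.eigh(A)

    # Squaring A amounts to squaring each eigenvalue on D's diagonal
    D2 = np.diag(eigenvalues**2)
    print(np.allclose(A @ A, P @ D2 @ P.T))   # True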
We've gotten far enough to note some more interesting properties of the SVD:
1. The columns of U are the eigenvectors of A*A'
2. The columns of V are the eigenvectors of A'*A
3. The diagonal elements of S are the square roots of the eigenvalues of A*A' and A'*A (see the quick check after this list).
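These facts are easy to spot-check in NumPy. A sketch with an arbitrary matrix; note that eigh returns eigenvalues in ascending order while svd returns singular values in descending order, and that eigenvectors are only determined up to sign:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 2.0],
                  [0.0, 1.0]])

    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # Fact 3: the squared singular values match the eigenvalues of A'*A
    eigenvalues, eigenvectors = np.linalg.eigh(A.T @ A)
    print(np.allclose(np.sort(s**2), eigenvalues))   # True

    # Fact 2: the columns of V match the eigenvectors of A'*A (up to order and sign)
    print(np.allclose(np.abs(Vt), np.abs(eigenvectors[:, ::-1].T)))   # True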
Like many other properties of the SVD laid out so far, these three facts also have profound implications. Hopefully I'll have time to look into them much more deeply in a future post, but I think everything above is more than enough to ponder for now.
I'll leave you with some equations that get us from the SVD to the eigen decompositions of A*A' and A'*A.
A = U*S*V'
A' = (U*S*V')'
A' = V*S'*U'
A' = V*S*U' (since S is diagonal, S' = S)
A*A' = (U*S*V')*(V*S*U')
= U*S*V'*V*S*U'
= U*S*(V'*V)*S*U'
= U*S*I*S*U'
= U*S^2*U'
A'*A = (V*S*U')*(U*S*V')
= V*S*U'*U*S*V'
= V*S*(U'*U)*S*V'
= V*S*I*S*V'
= V*S^2*V'
Note: This is true when A is rectangular as well as square. (Strictly speaking, for rectangular A the matrix S is rectangular, so S^2 stands for S*S' in the first equation and S'*S in the second; both are square diagonal matrices with the same nonzero entries.)
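And one last numeric check of the whole chain, reusing the deliberately rectangular A from the earlier sketch (full_matrices=False gives the reduced SVD, which is all we need here):

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 2.0],
                  [0.0, 1.0]])   # 3x2: rectangular on purpose

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    S2 = np.diag(s**2)

    print(np.allclose(A @ A.T, U @ S2 @ U.T))     # True: A*A' = U*S^2*U'
    print(np.allclose(A.T @ A, Vt.T @ S2 @ Vt))   # True: A'*A = V*S^2*V'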
In part #7, I look at the SVD of a singular matrix.