In keeping with the spirit I've described, I'll post my homework here.

Suppose we have a 2 x 2 matrix A (i.e. of order 2):

A = [ 5  2 ]
    [ 9  2 ]

The *characteristic polynomial of A* is given by:

det(A - kI)

I is the identity matrix of order 2:

I = [ 1  0 ]
    [ 0  1 ]

And k is a variable (I'm using k instead of λ, the traditional symbol).

Remembering how to evaluate the determinant of a 2 x 2 matrix:

det [ a  b ] = ad - bc
    [ c  d ]

so:

det(A - kI) = (5 - k)(2 - k) - (2)(9) = k^2 - 7k - 8

to get the characteristic equation, set:

k^2 - 7k - 8 = (k - 8)(k + 1) = 0
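As a numerical cross-check (a sketch, assuming numpy is available), the roots of a polynomial with these coefficients can be found with np.roots:

```python
import numpy as np

# coefficients of k^2 - 7k - 8, highest power first
coeffs = [1, -7, -8]

# np.roots finds the roots numerically; the order is not guaranteed,
# so sort them before looking
roots = sorted(np.roots(coeffs))

# the two eigenvalues, -1 and 8 (up to floating-point noise)
print(roots[0], roots[1])
```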

The solutions for k are then 8 and -1. These are the *eigenvalues* of A. It is also true (this is the Cayley-Hamilton theorem: a matrix satisfies its own characteristic polynomial) that:

A^2 - 7A - 8I = 0

(I'm using ^ here in the code to indicate a superscript). So we can check our math:

A^2 = [ 43  14 ]    7A = [ 35  14 ]    8I = [ 8  0 ]
      [ 63  22 ]         [ 63  14 ]         [ 0  8 ]

A^2 - 7A - 8I = [ 0  0 ]
                [ 0  0 ]
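The same check can be scripted (a sketch assuming numpy; the entries of A here are the ones reconstructed from this post's eigenvalues and eigenvectors):

```python
import numpy as np

# A as used in this post (entries consistent with eigenvalues 8 and -1)
A = np.array([[5, 2],
              [9, 2]])

# Cayley-Hamilton: A satisfies its own characteristic polynomial
result = A @ A - 7 * A - 8 * np.eye(2)
print(result)  # the 2 x 2 zero matrix
```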

The eigenvectors of A satisfy this equation:

A v = k v,  or equivalently  (A - kI) v = 0

where v is a vector with values x and y:

v = (x, y)

So for the first eigenvalue we have (A - 8I) v = 0, where:

A - 8I = [ -3   2 ]
         [  9  -6 ]

That is:

-3x + 2y = 0

and

9x - 6y = 0

We can choose x = 2, then y = 3 and

v1 = (2, 3)

Check our math:

A v1 = (16, 24) = 8 (2, 3) = 8 v1

Similarly, for the second eigenvalue we have (A + I) v = 0, where:

A + I = [ 6  2 ]
        [ 9  3 ]

That is:

6x + 2y = 0

and

9x + 3y = 0

Choose x = 1, then y = -3:

v2 = (1, -3)

Check the math:

A v2 = (-1, 3) = -1 (1, -3) = -1 v2
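Both eigenpair checks can also be run numerically (a sketch assuming numpy; A is the matrix reconstructed from this post's eigenpairs):

```python
import numpy as np

A = np.array([[5, 2],
              [9, 2]])

v1 = np.array([2, 3])    # eigenvector for k1 = 8
v2 = np.array([1, -3])   # eigenvector for k2 = -1

print(A @ v1)  # [16 24], i.e. 8 * v1
print(A @ v2)  # [-1  3], i.e. -1 * v2
```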

The eigenvectors are usually ordered according to the magnitude of the corresponding eigenvalue. That's why 8 is the *first* eigenvalue. The eigenvectors can be normalized to have length = 1. Here, the length of v1 = (2, 3) is:

sqrt(2^2 + 3^2) = sqrt(13)

so, the normalized v1 is:

(2/sqrt(13), 3/sqrt(13)) ≈ (0.5547, 0.8321)
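The normalization step, sketched with numpy:

```python
import numpy as np

v1 = np.array([2.0, 3.0])

length = np.linalg.norm(v1)   # sqrt(13), about 3.6056
u1 = v1 / length              # the normalized (unit-length) eigenvector

print(np.linalg.norm(u1))     # 1.0, up to floating-point rounding
```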

The trace of a matrix A is the sum along the diagonal:

tr(A) = 5 + 2 = 7

The determinant of A (calculated above) is -8.

The trace of a matrix is also equal to the sum of the eigenvalues (the sum of the k's).

The determinant of a matrix is the product of the k's. Since k1 = 8 and k2 = -1, we have 8 + (-1) = 7 = tr(A) and (8)(-1) = -8 = det(A), so both relationships hold for matrix A.
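Both identities are easy to verify numerically (a sketch assuming numpy; A is the matrix used throughout this post):

```python
import numpy as np

A = np.array([[5, 2],
              [9, 2]])

k = np.linalg.eigvals(A)            # the eigenvalues, 8 and -1

print(np.trace(A), k.sum())         # trace = sum of eigenvalues = 7
print(np.linalg.det(A), k.prod())   # determinant = product of eigenvalues = -8
```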

In R:

> A = matrix(c(5,9,2,2), nrow=2)
> eigen(A)

eigen() returns the eigenvalues (8 and -1) together with unit-length eigenvectors (possibly with their signs flipped).

Why do eigenvalues and eigenvectors matter to us? It turns out that if we have an n x p matrix like:

B = [ x1  y1 ]
    [ x2  y2 ]
    [ ...    ]
    [ xn  yn ]

where the x and y values are separately normalized. (They are z-scores, obtained by subtracting the mean for each column and then dividing by its standard deviation.)

We construct a covariance matrix:

C = (1/(n-1)) B^T B

where B^T is the transpose of B and n is the number of rows.

Then the eigenvalues and eigenvectors of C identify the principal components of B.
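A sketch of this idea in numpy, using made-up data. One detail worth noting: for columns that are already z-scores, the covariance matrix and the correlation matrix coincide, which may be the source of the mix-up mentioned in the update below.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))   # hypothetical raw data: n = 100 rows, p = 2 columns

# z-score each column: subtract the column mean, divide by the column sd
B = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# covariance matrix of the standardized data
n = B.shape[0]
C = (B.T @ B) / (n - 1)

# for z-scored columns this equals the correlation matrix of X
print(np.allclose(C, np.corrcoef(X, rowvar=False)))  # True

# the eigenvectors of C point along the principal component directions
vals, vecs = np.linalg.eigh(C)
```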

[UPDATE: This last part is not correct. I confused the correlation and covariance matrix. See here for details.]