It looks easy enough, except that here Q is a 4 x 4 matrix. But linear algebra teaches us that if we can find the eigenvalues and eigenvectors of Q, we can form a succession of 3 matrices such that

`Q = U Λ U^{-1}`

where Λ is a diagonal matrix. The upshot of that is that any function f we want to apply to Q is equal to:

`f(Q) = U f(Λ) U^{-1}`

U and U^{-1} are the right and left eigenvectors of Q, and moreover, as the notation implies:

`U U^{-1} = I`

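As a quick numerical check of that claim, here is a sketch in Python with numpy, using an arbitrary 2 × 2 matrix rather than the one from the post:

```python
import numpy as np

# An arbitrary diagonalizable matrix (not the one from the post).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of U are right eigenvectors; lam holds the eigenvalues.
lam, U = np.linalg.eig(A)
Uinv = np.linalg.inv(U)

# U U^{-1} = I
assert np.allclose(U @ Uinv, np.eye(2))

# f(A) = U f(Lambda) U^{-1}; here f is squaring:
A_sq = U @ np.diag(lam**2) @ Uinv
assert np.allclose(A_sq, A @ A)
```

The point is that f only ever has to be applied to the diagonal entries of Λ, never to the full matrix.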
So, let's see where that takes us. Long ago, we had this matrix:

Using the *characteristic equation* we found eigenvalues

We also found eigenvectors v, one for each eigenvalue, such that:

`Av = λv`

For example:

Checking our multiplication, rows x columns:

We scale v to be a unit vector:

And the second eigenvector is:

In R:

Any linear multiple of v is still an eigenvector, so it's OK that R has our v_{2} × -1.

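That scale freedom is easy to confirm numerically. A sketch with numpy, again using an arbitrary stand-in matrix:

```python
import numpy as np

# Arbitrary example matrix (not the one from the post).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, U = np.linalg.eig(A)

v = U[:, 0]                  # first right eigenvector
for c in (1.0, -1.0, 10.0):  # any nonzero multiple is still an eigenvector
    assert np.allclose(A @ (c * v), lam[0] * (c * v))
```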
So far, so good. Now, Yang says that what we need to do is "decompose" our matrix into right *and left* eigenvectors. We have a right eigenvector, but now we need a new left eigenvector for each eigenvalue such that:

`vA = vλ = λv`

Luckily, we can use inverses. He says that we want:

`Q = U Λ U^{-1}`

where the columns of U are the right eigenvectors of Q and the "rows of" U^{-1} are the left eigenvectors of Q. Λ is a diagonal matrix with the eigenvalues λ. Here is the code separately (makes it easier for you to run it):

and what it prints:

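A numpy version of the same decomposition, as a sketch in Python rather than R (the matrix is again an arbitrary stand-in):

```python
import numpy as np

# Arbitrary stand-in matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, U = np.linalg.eig(A)    # columns of U: right eigenvectors
Uinv = np.linalg.inv(U)      # rows of Uinv: left eigenvectors

# A = U Lambda U^{-1}
assert np.allclose(U @ np.diag(lam) @ Uinv, A)

# Each row of U^{-1} satisfies vA = lambda v:
for i in range(2):
    v = Uinv[i, :]
    assert np.allclose(v @ A, lam[i] * v)
```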
I would just note in passing that although `svd` sounds like what we want (singular value decomposition), it doesn't give the values we're looking for. I'm not sure why yet.
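One quick way to see that `svd` really is a different factorization: its singular values generally don't match the eigenvalues. A sketch with an arbitrary non-symmetric matrix:

```python
import numpy as np

# Arbitrary non-symmetric matrix; triangular, so eigenvalues are 1 and 3.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

eigvals = np.linalg.eigvals(A)
svals = np.linalg.svd(A, compute_uv=False)

# The singular values are not the eigenvalues of A; they are the square
# roots of the eigenvalues of A^T A.
assert not np.allclose(sorted(svals), sorted(eigvals))
assert np.allclose(sorted(svals**2), sorted(np.linalg.eigvals(A.T @ A)))
```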

Let's try a phylogenetics example using a simple matrix like Jukes-Cantor's Q:

Check.
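That step can be sketched in numpy as well. Here α = 0.25 is a hypothetical substitution rate, which gives off-diagonal entries α and diagonal entries -3α; the eigenvalues come out as 0 (once) and -4α (three times):

```python
import numpy as np

a = 0.25                                     # hypothetical substitution rate
# Jukes-Cantor Q: off-diagonal a, diagonal -3a.
Q = a * np.ones((4, 4)) - 4 * a * np.eye(4)

lam, U = np.linalg.eig(Q)

# One eigenvalue is 0; the other three are -4a (here -1.0):
assert np.allclose(sorted(lam), [-4 * a, -4 * a, -4 * a, 0.0])
```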

Now, what we need to do is to exponentiate Λ. That is, we compute exp { λt } for each eigenvalue, and place the results in a diagonal matrix. The code:

It prints:

Looks good. How about doing it in Python?

Output:

Code listing:
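A Python version of the whole pipeline might look like the following sketch (assuming numpy; the rate α and branch length t are arbitrary choices, not values from the post):

```python
import numpy as np

a, t = 0.25, 0.5                             # hypothetical rate and branch length

# Jukes-Cantor rate matrix: off-diagonal a, diagonal -3a.
Q = a * np.ones((4, 4)) - 4 * a * np.eye(4)

# Decompose Q = U Lambda U^{-1}.
lam, U = np.linalg.eig(Q)
Uinv = np.linalg.inv(U)

# Exponentiate each eigenvalue, then rebuild: P(t) = U diag(e^{lam t}) U^{-1}.
P = U @ np.diag(np.exp(lam * t)) @ Uinv

# Rows of a transition probability matrix sum to 1:
assert np.allclose(P.sum(axis=1), 1.0)

# And the diagonal matches the Jukes-Cantor closed form 1/4 + 3/4 e^{-4at}:
assert np.allclose(np.diag(P), 0.25 + 0.75 * np.exp(-4 * a * t))
```

Comparing the diagonal of P against the known Jukes-Cantor closed form is a convenient end-to-end check that the decomposition and the exponentiation were both done correctly.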