Still fooling around with matplotlib. This example is similar to the first one (here) but goes further, and computes the stuff we'd need for principal component analysis.
I'm still floundering a bit: I haven't yet figured out how to plot open circles, which is pretty surprising, since that's the default in R. 'markerfacecolor' and 'mfc', as mentioned in the docs for pyplot, don't work with ax.scatter (it takes different keyword arguments from plot). Also, all my attempts to adjust the axes are being ignored. And most important, there are surely library PCA routines that do this directly.
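For what it's worth, one approach that may do the trick: scatter seems to accept facecolors and edgecolors keywords (rather than markerfacecolor), and the axes can be set with set_xlim and set_ylim on the Axes object. A minimal sketch with made-up data:

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen, no display needed
import matplotlib.pyplot as plt
import numpy as np

x = np.random.normal(0, 1, 20)
y = np.random.normal(0, 1, 20)

fig, ax = plt.subplots()
# open circles: transparent face, black edge
ax.scatter(x, y, facecolors='none', edgecolors='k')
# adjust the axes explicitly on the Axes object
ax.set_xlim(-4, 4)
ax.set_ylim(-4, 4)
fig.savefig('open_circles.png')
```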
In the first part we do something very similar to the first example, except that the matrix is constructed from a list of lists, properly, so that we don't need a call to reset the shape to a 2-row by x-col one. Also, in anticipation of part 2, I've changed the example so that the y-axis is the one with most of the variance. These points are plotted as salmon diamonds. Then, the first matrix of points is rotated by 45 degrees as before, and these points are plotted as cyan squares.
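In outline (the variable names and the spread of the random data are my own choices), the construction and rotation might look like this:

```python
import numpy as np

# build the matrix from a list of lists: two rows (x, then y),
# so no call to reset the shape is needed;
# most of the variance is deliberately on the y-axis
x = np.random.normal(0, 0.3, 100)
y = np.random.normal(0, 3.0, 100)
A = np.array([x, y])           # shape (2, 100) directly

# rotate the points by 45 degrees
t = np.pi / 4
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
B = np.dot(R, A)               # rotated points, still (2, 100)
```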
In the second part of the code, we use np.cov to compute the covariance matrix of the rotated points, and then call np.linalg.eig on it to get the eigenvalues and eigenvectors. The eigenvalues are not guaranteed to come back in any sorted order (who knows why?), so they need to be sorted before use. To check that the eigenvectors are correct, we rotate the matrix again using them, and plot those points as well. As expected, they are now spread along the x-axis.
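A sketch of this step (with random data standing in for the example's points, and a sort added since np.linalg.eig makes no promises about order):

```python
import numpy as np

rng = np.random.RandomState(7)
# points with most of the variance on y, then rotated 45 degrees
A = np.array([rng.normal(0, 0.3, 500), rng.normal(0, 3.0, 500)])
t = np.pi / 4
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
B = np.dot(R, A)

M = np.cov(B)                     # 2 x 2 covariance matrix
evals, evecs = np.linalg.eig(M)   # order not guaranteed
order = np.argsort(evals)[::-1]   # largest eigenvalue first
evals = evals[order]
evecs = evecs[:, order]           # eigenvectors are the columns

# rotating by the transposed eigenvector matrix spreads
# the points back out along the x-axis
C = np.dot(evecs.T, B)
```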
The output (formatted by the function) shows the variances of the x- and y-points on line 1, followed by the two eigenvalues and the fraction of the total variance explained by the second (large) eigenvalue. As expected, this is about 99%.
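That fraction is just the large eigenvalue over the sum of both. With the spreads I used above (standard deviations 3.0 versus 0.3, so variances of roughly 9 versus 0.09), the ratio works out to about 0.99:

```python
import numpy as np

rng = np.random.RandomState(7)
# same setup: y has ten times the standard deviation of x
A = np.array([rng.normal(0, 0.3, 500), rng.normal(0, 3.0, 500)])
t = np.pi / 4
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
M = np.cov(np.dot(R, A))

evals = np.linalg.eigvals(M)
frac = evals.max() / evals.sum()   # fraction of total variance
# roughly 9 / (9 + 0.09), i.e. about 99%
```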