Friday, May 13, 2011

Correlation, mean-squared-error, mutual information, and signal-to-noise ratio for Gaussian random variables

I was reading a paper and encountered a figure that showed the correlation, mutual information, and mean-squared prediction error for a pair of time series. This seemed a bit redundant; it turned out the figure had been added at the request of a reviewer. If your data are jointly Gaussian, these all measure the same thing, so there is no need to clutter a figure by showing all of them.

[get these notes as PDF]

For a jointly Gaussian pair of random variables, correlation, mean-squared error, mutual information, and signal-to-noise ratio all measure the same thing and can be computed from each other.
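As a sketch of why, suppose X and Y are jointly Gaussian with correlation coefficient ρ and Var(Y) = σ_Y², and define the SNR as the ratio of explained to residual variance. Then the standard relations are:

MSE of the optimal predictor of Y from X:  MSE = σ_Y² (1 − ρ²)
Mutual information:  I(X;Y) = −(1/2) log(1 − ρ²)
Signal-to-noise ratio:  SNR = ρ² / (1 − ρ²) = σ_Y²/MSE − 1

so I(X;Y) = (1/2) log(1 + SNR), and any one of ρ², MSE/σ_Y², I(X;Y), or SNR determines the others.

Here is a quick numerical check in Python; this is a minimal sketch, with NumPy and the example values rho = 0.8 and sigma_y = 2.0 chosen purely for illustration (they are not from the notes).

import numpy as np

rho, sigma_y, n = 0.8, 2.0, 1_000_000   # assumed example values

# Draw zero-mean jointly Gaussian (x, y) with the chosen correlation
cov = [[1.0, rho * sigma_y], [rho * sigma_y, sigma_y ** 2]]
x, y = np.random.multivariate_normal([0.0, 0.0], cov, size=n).T

# Empirical correlation and MSE of the optimal linear predictor (means are zero)
rho_hat = np.corrcoef(x, y)[0, 1]
slope = np.cov(x, y)[0, 1] / np.var(x)
mse_hat = np.mean((y - slope * x) ** 2)

# Closed-form relations for jointly Gaussian variables
mse = sigma_y ** 2 * (1 - rho ** 2)
mi = -0.5 * np.log(1 - rho ** 2)          # in nats
snr = rho ** 2 / (1 - rho ** 2)

print(rho_hat, mse_hat, mse)              # rho_hat ~ rho, mse_hat ~ mse
print(mi, 0.5 * np.log(1 + snr))          # equal, by the relation above

The empirical correlation and prediction error match the closed-form values, and the mutual information computed from ρ equals the one computed from the SNR.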