I was reading a paper and encountered a figure that showed the correlation, mutual information, and mean-squared prediction error for a pair of time series. This seemed redundant. It turns out it was added to the paper at the request of a reviewer. If your data are jointly Gaussian, these all measure the same thing; there's no need to clutter a figure by showing all of them.
For a jointly Gaussian pair of random variables, correlation, mutual information, root-mean-squared error, and signal-to-noise ratio all measure the same thing and can be computed from each other.
Some identities
Consider two time series $x_t$ and $y_t$, each standardized to zero mean and unit variance.

Let's say we're interested in a linear relationship between $y_t$ and $x_t$:

$$y_t = a\,x_t + \xi_t,$$

where $\xi_t$ is zero-mean Gaussian noise, independent of $x_t$, with variance $\sigma_\xi^2$.

The linear dependence of $y$ on $x$ is summarized by a single parameter $a$.

Since the signal and noise are independent, their variances combine linearly:

$$\operatorname{Var}(y) = a^2\operatorname{Var}(x) + \operatorname{Var}(\xi) = a^2 + \sigma_\xi^2.$$

The sum $a^2 + \sigma_\xi^2$ must equal $1$, since $y$ has unit variance.

Incorporate this constraint by defining

$$a = \rho$$

and

$$\sigma_\xi^2 = 1 - \rho^2.$$

(We'll show later that $\rho$ is the Pearson correlation between $x$ and $y$.)

From this the signal-to-noise ratio and mutual information can be calculated.

The Signal-to-Noise Ratio (SNR) is the ratio of the signal and noise contributions to the variance of $y$:

$$\mathrm{SNR} = \frac{\rho^2}{1-\rho^2}.$$

On jointly Gaussian channels mutual information is $I = \tfrac12\log(1 + \mathrm{SNR})$, which here gives

$$I = \tfrac12\log\!\left(1 + \frac{\rho^2}{1-\rho^2}\right) = -\tfrac12\log\!\left(1-\rho^2\right).$$
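As a sanity check, these closed-form expressions can be evaluated on a simulated pair of standardized jointly Gaussian series. A minimal sketch in Python/NumPy (the variable names are mine, not from the derivation above):

```python
import numpy as np

# Simulate y = rho * x + noise with unit-variance x and Var(noise) = 1 - rho^2,
# so that y also has unit variance, then evaluate SNR and mutual information.
rng = np.random.default_rng(0)
rho = 0.8                                          # linear coupling parameter
n = 200_000

x = rng.standard_normal(n)                         # unit-variance signal
xi = np.sqrt(1 - rho**2) * rng.standard_normal(n)  # noise, variance 1 - rho^2
y = rho * x + xi                                   # unit-variance output

snr = rho**2 / (1 - rho**2)                        # signal-to-noise ratio
mi = 0.5 * np.log(1 + snr)                         # mutual information, nats

# The two forms of the mutual information agree:
assert np.isclose(mi, -0.5 * np.log(1 - rho**2))
print(f"Var(y) = {y.var():.3f}, SNR = {snr:.3f}, I = {mi:.3f} nats")
```

Note that $I$ comes out in nats because `np.log` is the natural logarithm; use `np.log2` for bits.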
Relationship between $a$, $\sigma_\xi^2$, $\mathrm{SNR}$, and Pearson correlation

Since $x$ and $\xi$ are independent and zero-mean,

$$\operatorname{cov}(x, y) = \mathbb{E}\big[x(\rho x + \xi)\big] = \rho\,\mathbb{E}[x^2] = \rho.$$

Then, because both series have unit variance, the Pearson correlation is

$$r = \frac{\operatorname{cov}(x, y)}{\sigma_x \sigma_y} = \rho.$$

This implies that the model parameter $\rho$ is exactly the Pearson correlation between $x$ and $y$, which implies that the SNR and mutual information can be computed directly from a measured correlation:

$$\mathrm{SNR} = \frac{r^2}{1-r^2}, \qquad I = -\tfrac12\log\!\left(1-r^2\right).$$
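In practice this means the whole chain can be estimated from the sample correlation alone. A short sketch (again, the names are my own):

```python
import numpy as np

# Estimate SNR and mutual information from the sample Pearson correlation
# of a simulated jointly Gaussian pair.
rng = np.random.default_rng(1)
rho = 0.6
n = 200_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

r = np.corrcoef(x, y)[0, 1]          # sample Pearson correlation, ~ rho
snr_hat = r**2 / (1 - r**2)          # estimated signal-to-noise ratio
mi_hat = -0.5 * np.log(1 - r**2)     # estimated mutual information (nats)
print(f"r = {r:.3f}, SNR = {snr_hat:.3f}, I = {mi_hat:.3f} nats")
```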
A few more identities
This can be used to relate correlation to mutual information:

$$\rho^2 = 1 - e^{-2I}.$$

If the mutual information is known, the correlation can therefore be recovered, up to sign, as $|\rho| = \sqrt{1 - e^{-2I}}$.

Mean squared error (MSE) is also related: the best linear prediction of $y$ from $x$ is $\hat y = \rho x$, and its error is just the noise,

$$\mathrm{MSE} = \mathbb{E}\big[(y - \rho x)^2\big] = \sigma_\xi^2 = 1 - \rho^2,$$

which implies that

$$\rho^2 = 1 - \mathrm{MSE},$$

and gives a relationship between mutual information and mean squared error:

$$I = -\tfrac12\log(\mathrm{MSE}).$$
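These last identities are easy to check numerically under the same simulated setup (a sketch; variable names are mine):

```python
import numpy as np

# Check MSE = 1 - rho^2 and I = -1/2 log(MSE) on simulated Gaussian data.
rng = np.random.default_rng(2)
rho = 0.7
n = 200_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

r = np.corrcoef(x, y)[0, 1]          # sample correlation, ~ rho
mse = np.mean((y - r * x) ** 2)      # error of the best linear predictor
mi_from_mse = -0.5 * np.log(mse)     # mutual information recovered from MSE

print(f"MSE = {mse:.3f}, 1 - r^2 = {1 - r**2:.3f}, I = {mi_from_mse:.3f} nats")
```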