I was reading a paper and encountered a figure that showed the correlation, mutual information, and mean-squared prediction error for a pair of time series. This seemed a bit redundant; it turns out it was added at the request of a reviewer. If your data are jointly Gaussian, these all measure the same thing, and there's no need to clutter a figure by showing all of them.
For a jointly Gaussian pair of random variables, correlation, mutual information, root-mean-squared error, and signal-to-noise ratio all measure the same thing and can be computed from each other.
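Concretely, for zero-mean, unit-variance jointly Gaussian variables with correlation ρ, the mutual information is -½ ln(1 − ρ²), the mean-squared error of the optimal linear predictor is 1 − ρ², and the signal-to-noise ratio of that predictor is ρ²/(1 − ρ²). Here's a minimal sketch (assuming standardized data and mutual information measured in nats) that estimates ρ from samples and derives the rest from it:

```python
import numpy as np

def gaussian_equivalences(x, y):
    """For (approximately) jointly Gaussian x, y: estimate correlation and
    the quantities it determines."""
    rho = np.corrcoef(x, y)[0, 1]
    # Mutual information (nats) of a jointly Gaussian pair with correlation rho
    mi = -0.5 * np.log(1.0 - rho**2)
    # Mean-squared error of the optimal linear predictor of y from x,
    # scaled by the variance of y (equals 1 - rho^2 for unit-variance y)
    mse = (1.0 - rho**2) * np.var(y)
    # Signal-to-noise ratio of that predictor
    snr = rho**2 / (1.0 - rho**2)
    return rho, mi, mse, snr

# Example: a correlated Gaussian pair with true correlation 0.8
rng = np.random.default_rng(0)
rho_true = 0.8
cov = [[1.0, rho_true], [rho_true, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T
print(gaussian_equivalences(x, y))
```

With enough samples, all four numbers are determined by the single estimated ρ, which is the point: reporting them separately adds no information for Gaussian data.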