Tuesday, December 1, 2020
Friday, October 16, 2020
Brain–Machine Interfaces: Closed-Loop Control in an Adaptive System
Monday, July 13, 2020
Convolution with the Hartley transform
The Hartley transform can be computed by summing the real and imaginary parts of the Fourier transform.
\begin{equation}\begin{aligned} \mathcal F \mathbf a &= \mathbf x + i\mathbf y \\ \mathcal H \mathbf a &= \mathbf x + \mathbf y, \end{aligned}\end{equation}where $\mathbf a$, $\mathbf x$, and $\mathbf y$ are real-valued vectors, $\mathcal F$ is the Fourier transform, and $\mathcal H$ is the Hartley transform. It has several useful properties.
- It is unitary, and also an involution: it is its own inverse.
- Its output is real-valued, so it can be used with numerical routines that cannot handle complex numbers.
- It can be computed in $\mathcal O (n \log(n))$ time using standard Fast Fourier Transform (FFT) libraries.
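As a minimal NumPy sketch of these properties (illustrative code, not from the original post; "dht" and "dht_convolve" are names invented here): note that NumPy's FFT uses the $e^{-2\pi i nk/N}$ sign convention, under which the Hartley transform is the real part minus the imaginary part; with the opposite sign convention it is the sum, as in the equation above.

```python
import numpy as np

def dht(a):
    # Discrete Hartley transform via the FFT. NumPy's FFT uses the
    # e^{-2*pi*i*n*k/N} convention, so the DHT is Re(F) - Im(F); the
    # "ortho" normalization keeps the transform unitary, which also
    # makes it an involution (its own inverse).
    F = np.fft.fft(a, norm="ortho")
    return F.real - F.imag

def dht_convolve(x, y):
    # Circular convolution via the Hartley convolution theorem:
    # Z[k] = ( X[k](Y[k]+Y[-k]) + X[-k](Y[k]-Y[-k]) ) / 2, up to scaling.
    n = len(x)
    X, Y = dht(x), dht(y)
    Xr, Yr = X[-np.arange(n) % n], Y[-np.arange(n) % n]  # reverse: k -> -k mod n
    Z = 0.5 * (X * (Y + Yr) + Xr * (Y - Yr))
    return dht(Z) * np.sqrt(n)  # undo the orthonormal scaling

rng = np.random.default_rng(0)
a, b = rng.standard_normal(64), rng.standard_normal(64)

assert np.isrealobj(dht(a))             # output is real-valued
assert np.allclose(dht(dht(a)), a)      # its own inverse
assert np.allclose(dht_convolve(a, b),  # matches FFT circular convolution
                   np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real)
```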
Tuesday, May 12, 2020
Gaussian process models for hippocampal grid cells
Gaussian processes (GPs) generalize the idea of multivariate Gaussian distributions to distributions over functions. In neuroscience, they can be used to estimate how the firing rate of a neuron varies as a function of other variables (e.g. to track retinal waves). Lately, we've been using Gaussian processes to describe the firing rate map of hippocampal grid cells.
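As a rough illustration of the basic idea (a generic GP-regression sketch with made-up data, not the model from the post, which handles Poisson spike-count noise on the log-rate), a squared-exponential kernel can be used to smooth a binned firing-rate estimate as a function of position:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.1, variance=1.0):
    # Squared-exponential covariance between two sets of 1D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_obs, y_obs, x_query, noise_var=1.0, **kernel_args):
    # Posterior mean of standard GP regression with Gaussian observation noise.
    K  = rbf_kernel(x_obs, x_obs, **kernel_args) + noise_var * np.eye(len(x_obs))
    Kq = rbf_kernel(x_query, x_obs, **kernel_args)
    return Kq @ np.linalg.solve(K, y_obs)

# Toy example: smooth binned spike counts as a function of 1D position.
rng      = np.random.default_rng(0)
position = np.linspace(0, 1, 100)
counts   = rng.poisson(5.0 * (1.0 + np.sin(6.0 * np.pi * position)))
rate_map = gp_posterior_mean(position, counts.astype(float), position,
                             noise_var=5.0, lengthscale=0.1)
```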
We review Bayesian inference and Gaussian processes, explore applications of GPs to analyzing grid cell data, and finally construct a GP model of the log-rate that accounts for the Poisson noise in spike count data. Along the way, we discuss fast approximations for these methods, like kernel density estimation or approximating GP inference using convolutions.
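For example, one such fast approximation (a generic sketch, not the post's implementation) is a kernel-density-style rate estimate: convolve the binned spike counts and the occupancy with a Gaussian kernel and take their ratio, which can serve as a cheap stand-in for full GP smoothing on an evenly spaced grid.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def kde_rate_map(spike_counts, occupancy, sigma_bins=2.0, eps=1e-12):
    # Kernel-density-style rate estimate: smooth the binned spike counts and
    # the occupancy (time spent in each bin) with the same Gaussian kernel,
    # then take their ratio. Works for 1D or 2D (grid-cell) rate maps.
    num = gaussian_filter(np.asarray(spike_counts, float), sigma_bins)
    den = gaussian_filter(np.asarray(occupancy, float), sigma_bins)
    return num / (den + eps)
```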
Edit: There is a bug in the "covariance_crosshairs" function; there should be a square root around "chi2.isf(1-p,df=2)".
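For reference, the square root is needed because the squared Mahalanobis distance of a 2D Gaussian is $\chi^2$-distributed with two degrees of freedom, so the radius of a $p$-confidence ellipse scales with the square root of the quantile. A minimal stand-alone sketch (an illustrative stand-in, not the post's "covariance_crosshairs" function itself):

```python
import numpy as np
from scipy.stats import chi2

def confidence_ellipse(mean, cov, p=0.95, n_points=200):
    # Boundary of the p-confidence region of a 2D Gaussian: a circle of
    # radius sqrt(chi2.isf(1-p, df=2)), reshaped by the covariance.
    radius = np.sqrt(chi2.isf(1 - p, df=2))
    theta  = np.linspace(0, 2 * np.pi, n_points)
    circle = radius * np.stack([np.cos(theta), np.sin(theta)])
    L = np.linalg.cholesky(np.asarray(cov, float))       # cov = L @ L.T
    return np.asarray(mean, float)[:, None] + L @ circle  # shape (2, n_points)
```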