These notes contain some derivations for variational inference in Poisson and probit Generalized Linear Models (GLMs) with a Gaussian prior and a Gaussian approximation to the posterior. (See also here.)
Problem statement
Consider a population of neurons with firing-intensities $\lambda = f(a)$, where $f(\cdot)$ is a firing-rate nonlinearity (here, either exponential or probit) applied pointwise to the synaptic activations $a$. Assume that the synaptic activations $a$ are Gaussian, with prior $a \sim \mathcal{N}(\mu_0, \Sigma_0)$, and that the observed spike counts $y$ are conditionally independent given $a$. We want to infer the distribution of $a$ given the observations $y$.
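As a purely illustrative sketch of this setup, the following simulates the generative model, with the population size, prior, and the exponential/Poisson choice all picked for concreteness rather than taken from these notes:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5                                       # population size (hypothetical)
mu0, Sigma0 = np.zeros(d), 0.5 * np.eye(d)  # Gaussian prior on activations

a = rng.multivariate_normal(mu0, Sigma0)    # latent synaptic activations
lam = np.exp(a)                             # firing intensities, lambda = f(a)
y = rng.poisson(lam)                        # observed spike counts
```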
Variational Bayes
In variational Bayes, the posterior on $a$ is approximated by a simpler distribution, here a Gaussian $q(a) = \mathcal{N}(\mu, \Sigma)$. The variational parameters $(\mu, \Sigma)$ are chosen to maximize the Evidence Lower Bound (ELBO), or equivalently to minimize the loss
$$\mathcal{L}(\mu, \Sigma) = -\langle \ln p(y \mid a) \rangle + D_{\mathrm{KL}}\left[ q(a) \,\middle\|\, p(a) \right].$$
(In these notes, all expectations $\langle\cdot\rangle$ are taken with respect to the variational posterior $q(a)$.)
Since both $q(a)$ and the prior $p(a) = \mathcal{N}(\mu_0, \Sigma_0)$ are Gaussian, the KL divergence has the standard closed form
$$D_{\mathrm{KL}}\left[ q \,\middle\|\, p \right] = \tfrac{1}{2}\left[ \operatorname{tr}\!\left(\Sigma_0^{-1}\Sigma\right) + (\mu-\mu_0)^\top \Sigma_0^{-1} (\mu-\mu_0) - \ln\frac{|\Sigma|}{|\Sigma_0|} - d \right],$$
where $d$ is the dimension of $a$.
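As a concrete reference, here is a minimal numpy sketch of this KL divergence; the function name and the use of `slogdet` for numerical stability are my own choices, not from the original notes:

```python
import numpy as np

def gaussian_kl(mu, Sigma, mu0, Sigma0):
    """KL[ N(mu, Sigma) || N(mu0, Sigma0) ] for d-dimensional Gaussians."""
    d = mu.shape[0]
    Sigma0_inv = np.linalg.inv(Sigma0)
    diff = mu - mu0
    _, logdet  = np.linalg.slogdet(Sigma)
    _, logdet0 = np.linalg.slogdet(Sigma0)
    return 0.5 * (np.trace(Sigma0_inv @ Sigma)
                  + diff @ Sigma0_inv @ diff
                  - logdet + logdet0 - d)
```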
For our choice of a canonically-parameterized natural exponential family, $p(y \mid a) = h(y)\exp\!\left( y^\top a - \mathbf{1}^\top A(a) \right)$ with log-partition function $A(\cdot)$ applied pointwise, the expected negative log-likelihood can be written as:
$$-\langle \ln p(y \mid a) \rangle = \mathbf{1}^\top \langle A(a) \rangle - y^\top \langle a \rangle + \text{const}.$$
Neglecting constants and terms that do not depend on $(\mu, \Sigma)$, the objective to minimize becomes
$$\mathcal{L}(\mu, \Sigma) = \mathbf{1}^\top \langle A(a) \rangle - y^\top \mu + D_{\mathrm{KL}}\left[ q \,\middle\|\, p \right].$$
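To make the objective concrete, here is a hedged sketch of $\mathcal{L}(\mu, \Sigma)$ for the Poisson model with exponential nonlinearity, where $A(a) = e^a$ and $\langle e^{a_i} \rangle = \exp(\mu_i + \Sigma_{ii}/2)$ (derived in the next section); it reuses the `gaussian_kl` helper defined above:

```python
def negative_elbo(mu, Sigma, y, mu0, Sigma0):
    """L(mu, Sigma) for the Poisson/exponential model, up to constants.

    Uses <A(a)> = <e^a> = exp(mu + diag(Sigma)/2), which involves only
    the marginal variances because A acts pointwise on a.
    """
    EA = np.exp(mu + 0.5 * np.diag(Sigma))   # <A(a)>, elementwise
    return EA.sum() - y @ mu + gaussian_kl(mu, Sigma, mu0, Sigma0)
```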
Closed-form expectations
To optimize $\mathcal{L}$, we need the expectation $\langle A(a) \rangle$ and its derivatives in closed form. Because we've assumed a Gaussian posterior on our latent state, and because $A$ acts pointwise, these reduce to one-dimensional Gaussian integrals that can be evaluated analytically for the nonlinearities considered here. Consider a single, scalar activation $a \sim \mathcal{N}(\mu, \sigma^2)$. For more compact notation, denote the expected firing rate as $\bar\lambda = \langle f(a) \rangle$.
Closed-form expressions for $\bar\lambda$ exist for both firing-rate nonlinearities considered here:
- Choosing the exponential nonlinearity $f(a) = e^a$ gives $\bar\lambda = \exp(\mu + \sigma^2/2)$, the mean of a log-normal random variable.
- Choosing the probit nonlinearity $f(a) = \Phi(a)$, where $\Phi$ is the standard normal CDF, gives $\bar\lambda = \Phi\!\left( \mu / \sqrt{1+\sigma^2} \right)$.
For the probit firing-rate nonlinearity, we will also need to know the expectation of the standard normal density $\varphi$, which appears in the gradients below:
$$\langle \varphi(a) \rangle = \frac{1}{\sqrt{1+\sigma^2}}\, \varphi\!\left( \frac{\mu}{\sqrt{1+\sigma^2}} \right).$$
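These expectations are easy to sanity-check numerically. The following sketch (assuming numpy and scipy; function names are mine) compares each closed form against a Monte Carlo estimate:

```python
import numpy as np
from scipy.stats import norm

def exp_rate(mu, s2):
    """<e^a> for scalar a ~ N(mu, s2): the log-normal mean."""
    return np.exp(mu + 0.5 * s2)

def probit_rate(mu, s2):
    """<Phi(a)> for scalar a ~ N(mu, s2)."""
    return norm.cdf(mu / np.sqrt(1.0 + s2))

def probit_density(mu, s2):
    """<phi(a)> for scalar a ~ N(mu, s2); used in probit gradients."""
    return norm.pdf(mu / np.sqrt(1.0 + s2)) / np.sqrt(1.0 + s2)

rng = np.random.default_rng(0)
mu, s2 = 0.3, 0.5
a = rng.normal(mu, np.sqrt(s2), size=1_000_000)
print(exp_rate(mu, s2),       np.exp(a).mean())    # agree to ~3 digits
print(probit_rate(mu, s2),    norm.cdf(a).mean())
print(probit_density(mu, s2), norm.pdf(a).mean())
```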
Gradients in $\mu$ and $\Sigma$
The gradient of $\mathcal{L}$ with respect to $\mu$ is
$$\nabla_\mu \mathcal{L} = \langle A'(a) \rangle - y + \Sigma_0^{-1}(\mu - \mu_0),$$
where the expectation term is differentiated using Bonnet's theorem, $\nabla_\mu \langle g(a) \rangle = \langle g'(a) \rangle$ for $a \sim \mathcal{N}(\mu, \Sigma)$, and the final term is the gradient of the KL divergence.
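Continuing the numpy sketches above, here is this gradient for the Poisson/exponential case, where $\langle A'(a) \rangle = \exp(\mu + \operatorname{diag}(\Sigma)/2)$ as shown in the next paragraph; checking it against a finite difference of `negative_elbo` is a quick sanity test:

```python
def grad_mu(mu, Sigma, y, mu0, Sigma0):
    """Gradient of L in mu for the Poisson/exponential model."""
    EA1 = np.exp(mu + 0.5 * np.diag(Sigma))       # <A'(a)> = <e^a>
    return EA1 - y + np.linalg.solve(Sigma0, mu - mu0)
```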
The Hessian-vector product for the term $\langle A(a) \rangle$ follows from differentiating the expectation twice: $\nabla^2_\mu \langle \mathbf{1}^\top A(a) \rangle = \operatorname{diag}\!\left(\langle A''(a) \rangle\right)$, so $(\nabla^2_\mu \mathcal{L})\, v = \langle A''(a) \rangle \odot v + \Sigma_0^{-1} v$ for any vector $v$, which supports Newton or conjugate-gradient updates of $\mu$ without forming the Hessian explicitly. The first step in obtaining the gradient in $\Sigma$ is the same second-derivative identity: by Price's theorem, $\partial \langle A(a) \rangle / \partial \sigma^2 = \tfrac{1}{2}\langle A''(a) \rangle$ for scalar $a \sim \mathcal{N}(\mu, \sigma^2)$, and since $A$ acts pointwise only the marginal variances $\Sigma_{ii}$ enter, giving
$$\nabla_\Sigma \mathcal{L} = \tfrac{1}{2} \operatorname{diag}\!\left( \langle A''(a) \rangle \right) + \tfrac{1}{2}\left( \Sigma_0^{-1} - \Sigma^{-1} \right).$$
For the exponential firing-rate nonlinearity, $A(a) = e^a$, so $A'(a) = A''(a) = e^a$ and every expectation above reduces to the expected rate, $\langle A'(a) \rangle = \langle A''(a) \rangle = \bar\lambda = \exp(\mu + \sigma^2/2)$, evaluated elementwise.
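Putting the exponential case together, here is a minimal sketch (same hedged numpy conventions as above) of the Hessian-vector product in $\mu$ and the gradient in $\Sigma$; the KL term contributes $\tfrac{1}{2}(\Sigma_0^{-1} - \Sigma^{-1})$ to the latter:

```python
def hvp_mu(mu, Sigma, v, Sigma0):
    """(Hessian of L in mu) @ v for the Poisson/exponential model.

    The likelihood Hessian is diagonal, diag(<A''(a)>) = diag(<e^a>),
    so the product never forms the full Hessian.
    """
    EA2 = np.exp(mu + 0.5 * np.diag(Sigma))       # <A''(a)> = <e^a>
    return EA2 * v + np.linalg.solve(Sigma0, v)

def grad_Sigma(mu, Sigma, mu0, Sigma0):
    """Gradient of L in Sigma for the Poisson/exponential model."""
    EA2 = np.exp(mu + 0.5 * np.diag(Sigma))       # <A''(a)> = <e^a>
    return (0.5 * np.diag(EA2)
            + 0.5 * (np.linalg.inv(Sigma0) - np.linalg.inv(Sigma)))
```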
Conclusion
That's all for now! I'll need to integrate these with the various other derivations (e.g. see also here).