Quadrature (and Monte Carlo) for computing variational expectations #3
I am afraid that only works for the Gaussian likelihood :(
Isn't it possible for some - e.g. for Gamma here?
Closed-form posterior only for Gaussian likelihood, but closed-form
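The point raised above - that the variational expectation can still be closed form for some non-Gaussian likelihoods - can be checked for a Gamma likelihood with a log link. The sketch below is illustrative Python, not the SparseGPs.jl code; the shape `alpha`, observation `y`, and variational parameters `m`, `s` are hypothetical values. With `f` the log-rate, only `E[f] = m` and `E[exp(f)] = exp(m + s^2/2)` are needed, so the expectation is exact; a Gauss-Hermite estimate should agree:

```python
import math
import numpy as np

def gh_expectation(g, m, s, n_points=30):
    # E_{N(m, s^2)}[g(f)] via Gauss-Hermite: substitute f = m + sqrt(2)*s*x,
    # so the integral becomes (1/sqrt(pi)) * sum_i w_i * g(m + sqrt(2)*s*x_i).
    x, w = np.polynomial.hermite.hermgauss(n_points)
    return np.sum(w * g(m + math.sqrt(2.0) * s * x)) / math.sqrt(math.pi)

alpha, y = 2.0, 1.5   # hypothetical Gamma shape and observation
m, s = 0.2, 0.4       # hypothetical variational mean and std of q(f)

# log Gamma(y; shape alpha, rate exp(f)) -- here f is the log-rate
log_lik = lambda f: (alpha * f - y * np.exp(f)
                     + (alpha - 1) * math.log(y) - math.lgamma(alpha))

# Closed form: only E[f] = m and E[exp(f)] = exp(m + s^2/2) are needed
closed = (alpha * m - y * math.exp(m + s**2 / 2)
          + (alpha - 1) * math.log(y) - math.lgamma(alpha))

print(gh_expectation(log_lik, m, s), closed)
```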
I have a working end-to-end example of classification in https://github.com/rossviljoen/SparseGPs.jl/blob/master/examples/classification.jl
Implemented in #9
For many non-Gaussian likelihoods, the expectation in the first term of the ELBO from [1],

E_{q(f)} [ \log p(y|f) ],

has no closed-form solution, so it needs to be approximated somehow - typically by Gauss-Hermite quadrature, but possibly also via Monte Carlo. So, the planned approach for implementing this is:

[1] Hensman, James, Alexander Matthews, and Zoubin Ghahramani. "Scalable variational Gaussian process classification." Artificial Intelligence and Statistics. PMLR, 2015.
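Both approximations mentioned above can be sketched in a few lines. This is illustrative Python (not the SparseGPs.jl implementation); the function names and the test values for `y`, `m`, `s`, and `sigma` are made up for the example. With `q(f) = N(m, s^2)`, Gauss-Hermite uses the substitution `f = m + sqrt(2)*s*x`, while Monte Carlo simply averages the log-likelihood over samples of `f`:

```python
import numpy as np

def expected_loglik_quadrature(log_lik, y, m, s, n_points=20):
    """Gauss-Hermite estimate of E_{N(m, s^2)}[log_lik(y, f)].

    With f = m + sqrt(2)*s*x, the Gaussian expectation becomes
    (1/sqrt(pi)) * sum_i w_i * log_lik(y, m + sqrt(2)*s*x_i)."""
    x, w = np.polynomial.hermite.hermgauss(n_points)
    f = m + np.sqrt(2.0) * s * x
    return np.sum(w * log_lik(y, f)) / np.sqrt(np.pi)

def expected_loglik_mc(log_lik, y, m, s, n_samples=100_000, seed=0):
    """Monte Carlo estimate: average log_lik over samples f ~ N(m, s^2)."""
    rng = np.random.default_rng(seed)
    f = rng.normal(m, s, size=n_samples)
    return np.mean(log_lik(y, f))

# Sanity check against the Gaussian likelihood, where the expectation
# has the closed form the comments above refer to:
# E[log N(y; f, sigma^2)] = -0.5*log(2*pi*sigma^2) - ((y-m)^2 + s^2)/(2*sigma^2)
sigma = 0.5
log_gauss = lambda y, f: (-0.5 * np.log(2 * np.pi * sigma**2)
                          - (y - f) ** 2 / (2 * sigma**2))
y, m, s = 0.3, 0.1, 0.7
exact = (-0.5 * np.log(2 * np.pi * sigma**2)
         - ((y - m) ** 2 + s**2) / (2 * sigma**2))

print(expected_loglik_quadrature(log_gauss, y, m, s), exact)
print(expected_loglik_mc(log_gauss, y, m, s))
```

For the Gaussian case the integrand is a degree-2 polynomial in `f`, so the quadrature estimate is exact (up to floating point) even with few nodes; the Monte Carlo estimate converges at the usual `O(1/sqrt(n))` rate.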