
Tutorial on writing custom training loops #895

Closed
michaeldeistler opened this issue Nov 30, 2023 · 1 comment
Labels
documentation Improvements or additions to documentation

Comments

@michaeldeistler
Contributor

Ideally, before doing this, we should swap out the flows so that the likelihood_estimator has a nicer API.

import torch

from sbi.inference import SNLE, likelihood_estimator_based_potential, MCMCPosterior
from sbi.utils import BoxUniform, likelihood_nn

num_dim = 2
prior = BoxUniform(-torch.ones((num_dim,)), torch.ones((num_dim,)))
theta = prior.sample((1000,))
x = theta + torch.randn((1000, num_dim))
x_o = torch.randn((1, num_dim))

likelihood_estimator = likelihood_nn("maf")
# ... train however you want ...

potential_fn, parameter_transform = likelihood_estimator_based_potential(
    likelihood_estimator, prior, x_o
)
posterior = MCMCPosterior(
    potential_fn, proposal=prior, theta_transform=parameter_transform
)
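To make the "train however you want" step concrete, here is one possible shape such a custom loop could take. This is a hypothetical sketch, independent of sbi's actual internals: the GaussianEstimator class, its log_prob signature, and all variable names are stand-ins for whatever density estimator likelihood_nn builds, assuming only that the estimator exposes a log_prob(x, context) method whose mean you maximize.

```python
import torch
from torch import nn

class GaussianEstimator(nn.Module):
    """Toy conditional density estimator: predicts the mean of a
    unit-variance Gaussian over x given theta. A stand-in for the
    MAF that likelihood_nn would build."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Linear(dim, dim)

    def log_prob(self, x, context):
        mean = self.net(context)
        # Log-density of a unit-variance Gaussian, up to a constant.
        return -0.5 * ((x - mean) ** 2).sum(dim=-1)

num_dim = 2
theta_train = torch.rand((1000, num_dim)) * 2 - 1  # stand-in for prior samples
x_train = theta_train + torch.randn((1000, num_dim))

estimator = GaussianEstimator(num_dim)
optimizer = torch.optim.Adam(estimator.parameters(), lr=1e-2)

losses = []
for epoch in range(200):
    optimizer.zero_grad()
    # Maximum-likelihood training: minimize the negative mean log-prob.
    loss = -estimator.log_prob(x_train, context=theta_train).mean()
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

Once trained this way, the estimator would be handed to likelihood_estimator_based_potential exactly as in the snippet above; the point of the issue is that this hand-off should work without the user touching sbi's internal training machinery.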
@michaeldeistler added the documentation label Nov 30, 2023
@michaeldeistler
Contributor Author

Duplicate of #766

@michaeldeistler michaeldeistler marked this as a duplicate of #766 Jan 16, 2024
@janfb janfb added this to the Pre Hackathon 2024 milestone Feb 6, 2024