
Adds Monte Carlo Samplers #340

Merged
merged 44 commits on Sep 2, 2024
Changes from 25 commits
Commits
44 commits
89ae3e6
feat: initial Monte Carlo classes
BradyPlanden May 24, 2024
3c48f88
feat: updt __init__.py, add LogPosterior
BradyPlanden May 24, 2024
a62a126
tests: add unit tests for MCMC samplers
BradyPlanden Jun 3, 2024
90bb173
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 3, 2024
fbe56a4
fix parallel for windows
BradyPlanden Jun 3, 2024
8f74b6d
tests: additional unit tests, refactors priors class
BradyPlanden Jun 3, 2024
2d83315
tests: increase coverage, adds monte carlo integration test
BradyPlanden Jun 5, 2024
5ed3d23
tests: increase coverage, bugfix multi_log_pdf logic
BradyPlanden Jun 5, 2024
ca961a2
tests: increase coverage, update priors on intesampling integration t…
BradyPlanden Jun 5, 2024
da21506
tests: increment coverage, refactor prior np.inf catch
BradyPlanden Jun 5, 2024
ce1cb54
refactor: removes redundant code
BradyPlanden Jun 5, 2024
c86531b
Merge branch 'develop', updts for Parameters class
BradyPlanden Jun 7, 2024
3e4c01e
refactor: adds improvements from parameters class
BradyPlanden Jun 7, 2024
b5ec8fe
feat: Adds burn-in functionality for sampling class
BradyPlanden Jun 15, 2024
f71bf6a
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 17, 2024
1f7c6cb
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 18, 2024
c8db9f5
fix: correct sigma0 to cov0
BradyPlanden Jun 18, 2024
eaaebb2
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 3, 2024
fb97b5d
refactor: move general methods into parent class, replace burn_in wit…
BradyPlanden Jul 3, 2024
942dc5e
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 4, 2024
990c590
Apply suggestions from code review
BradyPlanden Jul 4, 2024
5f89231
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 16, 2024
74b1f30
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 16, 2024
13aa83f
refactor: log_pdf to base class, update docstrings.
BradyPlanden Jul 21, 2024
0b32889
Adds catches and initialisation for x0, update tests
BradyPlanden Jul 21, 2024
0117066
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 6, 2024
251f86f
feat: updates for transformation class, cleanup
BradyPlanden Aug 7, 2024
5bb4d94
fix: uniformly apply bound transformations, update LogPosterior
BradyPlanden Aug 7, 2024
a5244a4
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 7, 2024
255aa5d
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 7, 2024
c9946da
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 13, 2024
cd07072
refactor: ComposedLogPrior -> JointLogPrior, prior.evaluateS1 -> logp…
BradyPlanden Aug 13, 2024
08dc407
fix: update tests for low convergence sampler
BradyPlanden Aug 13, 2024
c225065
refactor: update priors, refactor JointLogPrior
BradyPlanden Aug 14, 2024
4df0885
tests: update unit tests and increase coverage.
BradyPlanden Aug 14, 2024
e50812a
refactor: base_sampler init, update docstrings, update tests, remove …
BradyPlanden Aug 14, 2024
7a000cf
tests: increase coverage, remove redundant ValueError, sampler.chains…
BradyPlanden Aug 14, 2024
711dcc8
tests: restore parallel optimisation with thread limit to 1
BradyPlanden Aug 14, 2024
df1cc73
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 22, 2024
bca3bbb
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 22, 2024
248b161
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 29, 2024
503af19
Refactor and bugfixes. Adds gradient-based integration sampling tests…
BradyPlanden Aug 29, 2024
85e1ce1
Remainder review suggestions, update assert tolerances, small array d…
BradyPlanden Aug 30, 2024
8a928af
tests: increment iterations from scheduled test run
BradyPlanden Sep 2, 2024
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,8 @@

## Features


- [#6](https://github.com/pybop-team/PyBOP/issues/6) - Adds Monte Carlo sampling functionality, with methods based on Pints' algorithms. Adds a `BaseSampler` base class, alongside `BasePintsSampler`.
- [#383](https://github.com/pybop-team/PyBOP/pull/383) - Adds Minkowski and SumofPower cost classes, with an example and corresponding tests.
- [#403](https://github.com/pybop-team/PyBOP/pull/403/) - Adds lychee link checking action.

90 changes: 90 additions & 0 deletions examples/scripts/mcmc_example.py
@@ -0,0 +1,90 @@
import numpy as np
import plotly.graph_objects as go

import pybop

# Parameter set and model definition
parameter_set = pybop.ParameterSet.pybamm("Chen2020")
synth_model = pybop.lithium_ion.DFN(parameter_set=parameter_set)

# Fitting parameters
parameters = [
pybop.Parameter(
"Negative electrode active material volume fraction",
prior=pybop.Gaussian(0.68, 0.05),
),
pybop.Parameter(
"Positive electrode active material volume fraction",
prior=pybop.Gaussian(0.58, 0.05),
),
]

# Generate data
init_soc = 1.0
sigma = 0.001
experiment = pybop.Experiment(
[
(
"Discharge at 0.5C until 2.5V (10 second period)",
"Charge at 0.5C until 4.2V (10 second period)",
),
]
# * 2
)
values = synth_model.predict(init_soc=init_soc, experiment=experiment)


def noise(sigma):
return np.random.normal(0, sigma, len(values["Voltage [V]"].data))


# Form dataset
dataset = pybop.Dataset(
{
"Time [s]": values["Time [s]"].data,
"Current function [A]": values["Current [A]"].data,
"Voltage [V]": values["Voltage [V]"].data + noise(sigma),
"Bulk open-circuit voltage [V]": values["Bulk open-circuit voltage [V]"].data
+ noise(sigma),
}
)

model = pybop.lithium_ion.SPM(parameter_set=parameter_set)
signal = ["Voltage [V]", "Bulk open-circuit voltage [V]"]

# Generate problem, likelihood, and sampler
problem = pybop.FittingProblem(
model, parameters, dataset, signal=signal, init_soc=init_soc
)
likelihood = pybop.GaussianLogLikelihoodKnownSigma(problem, sigma0=0.002)
prior1 = pybop.Gaussian(0.7, 0.1)
prior2 = pybop.Gaussian(0.6, 0.1)
composed_prior = pybop.ComposedLogPrior(prior1, prior2)
posterior = pybop.LogPosterior(likelihood, composed_prior)

optim = pybop.DREAM(
posterior,
chains=4,
x0=[0.68, 0.58],
max_iterations=300,
burn_in=100,
    # parallel=True, # uncomment to enable parallelisation (macOS/WSL/Linux only)
)
result = optim.run()

# Create a histogram
fig = go.Figure()
for _i, data in enumerate(result):
fig.add_trace(go.Histogram(x=data[:, 0], name="Neg", opacity=0.75))
fig.add_trace(go.Histogram(x=data[:, 1], name="Pos", opacity=0.75))

# Update layout for better visualization
fig.update_layout(
title="Posterior distribution of volume fractions",
xaxis_title="Value",
yaxis_title="Count",
barmode="overlay",
)

# Show the plot
fig.show()
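Each element of `result` above is one chain's array of shape `(samples, n_params)`. A minimal post-processing sketch in plain NumPy for summarising the posterior — the chains below are synthetic stand-ins for `result`, for illustration only:

```python
import numpy as np

# Hypothetical chains standing in for `result`: 4 chains of 200 samples
# over the 2 fitted volume fractions (illustrative values, not real output)
rng = np.random.default_rng(0)
chains = [rng.normal([0.68, 0.58], 0.01, size=(200, 2)) for _ in range(4)]

# Pool all chains and report a per-parameter posterior mean and spread
pooled = np.vstack(chains)  # shape: (n_chains * n_samples, n_params)
means = pooled.mean(axis=0)
stds = pooled.std(axis=0)
for name, m, s in zip(["Negative", "Positive"], means, stds):
    print(f"{name} electrode volume fraction: {m:.3f} +/- {s:.3f}")
```

In practice the burn-in samples are already discarded by the sampler (via the `burn_in` option above), so pooling the returned chains directly is reasonable once the chains have converged.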
29 changes: 24 additions & 5 deletions pybop/__init__.py
@@ -60,7 +60,7 @@
#
from .parameters.parameter import Parameter, Parameters
from .parameters.parameter_set import ParameterSet
from .parameters.priors import BasePrior, Gaussian, Uniform, Exponential
from .parameters.priors import BasePrior, Gaussian, Uniform, Exponential, ComposedLogPrior

#
# Model classes
@@ -72,14 +72,14 @@
from .models.base_model import Inputs

#
# Problem class
# Problem classes
#
from .problems.base_problem import BaseProblem
from .problems.fitting_problem import FittingProblem
from .problems.design_problem import DesignProblem

#
# Cost function class
# Cost classes
#
from .costs.base_cost import BaseCost
from .costs.fitting_costs import (
@@ -98,11 +98,12 @@
BaseLikelihood,
GaussianLogLikelihood,
GaussianLogLikelihoodKnownSigma,
LogPosterior,
MAP,
)

#
# Optimiser class
# Optimiser classes
#

from .optimisers._cuckoo import CuckooSearchImpl
@@ -128,14 +129,32 @@
)
from .optimisers.optimisation import Optimisation

#
# Monte Carlo classes
#
from .samplers import BaseSampler
from .samplers.base_mcmc import BasePintsSampler
from .samplers.pints_samplers import (
NUTS, DREAM, AdaptiveCovarianceMCMC,
DifferentialEvolutionMCMC, DramACMC,
EmceeHammerMCMC,
HaarioACMC, HaarioBardenetACMC,
HamiltonianMCMC, MALAMCMC,
MetropolisRandomWalkMCMC, MonomialGammaHamiltonianMCMC,
PopulationMCMC, RaoBlackwellACMC,
RelativisticMCMC, SliceDoublingMCMC,
SliceRankShrinkingMCMC, SliceStepoutMCMC,
)
from .samplers.mcmc_sampler import MCMCSampler

#
# Observer classes
#
from .observers.unscented_kalman import UnscentedKalmanFilterObserver
from .observers.observer import Observer

#
# Plotting class
# Plotting classes
#
from .plotting.plotly_manager import PlotlyManager
from .plotting.quick_plot import StandardPlot, StandardSubplot, plot_trajectories
96 changes: 96 additions & 0 deletions pybop/costs/_likelihoods.py
@@ -352,3 +352,99 @@ def _evaluateS1(self, inputs: Inputs) -> tuple[float, np.ndarray]:
total_gradient = dl + prior_gradient

return posterior, total_gradient


class LogPosterior(BaseCost):
"""
The Log Posterior for a given problem.

Computes the log posterior, which is the sum of the log
likelihood and the log prior.

Inherits all parameters and attributes from ``BaseCost``.
"""

def __init__(self, log_likelihood, log_prior=None):
super().__init__(problem=log_likelihood.problem)

# Store the likelihood and prior
self._log_likelihood = log_likelihood
self._prior = log_prior
if self._prior is None:
try:
self._prior = log_likelihood.problem.parameters.priors()
except Exception as e:
raise ValueError(
f"An error occurred when constructing the Prior class: {e}"
) from e

def _evaluate(self, x, grad=None):
"""
Calculate the posterior cost for a given set of parameters.

Parameters
----------
x : array-like
The parameters for which to evaluate the cost.
grad : array-like, optional
An array to store the gradient of the cost function with respect
to the parameters.

Returns
-------
float
The posterior cost.
"""
prior = self._prior(x)
if not np.isfinite(prior):
return prior
return prior + self._log_likelihood.evaluate(x)

def _evaluateS1(self, x):
"""
Compute the posterior and its gradient with respect to the parameters.
The likelihood gradient is passed to the optimiser without modification.

Parameters
----------
x : array-like
The parameters for which to compute the cost and gradient.

Returns
-------
tuple
A tuple containing the cost and the gradient. The cost is a float,
and the gradient is an array-like of the same length as `x`.

Raises
------
ValueError
If an error occurs during the calculation of the cost or gradient.
"""
prior, dp = self._prior.evaluateS1(x)
if not np.isfinite(prior):
return prior, dp
likelihood, dl = self._log_likelihood.evaluateS1(x)
return prior + likelihood, dp + dl

def prior(self):
"""
Return the prior object.

Returns
-------
object
The prior object.
"""
return self._prior

def likelihood(self):
"""
Return the likelihood object.

Returns
-------
object
The likelihood object.
"""
return self._log_likelihood
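The additive structure that `_evaluate` implements — log posterior = log prior + log likelihood, with an early return when the prior is non-finite so the likelihood is never evaluated for rejected parameters — can be checked on a toy one-dimensional model. This sketch uses SciPy in place of PyBOP's classes; all names here are illustrative, not the library's API:

```python
import numpy as np
from scipy.stats import norm

# Toy 1-D model: flat prior on [0, 1], Gaussian likelihood centred at 0.5
def log_prior(x):
    return 0.0 if 0.0 <= x <= 1.0 else -np.inf

def log_likelihood(x):
    return norm.logpdf(x, loc=0.5, scale=0.2)

def log_posterior(x):
    # Mirrors LogPosterior._evaluate: return early if the prior is -inf,
    # so the (potentially expensive) likelihood is never computed
    lp = log_prior(x)
    if not np.isfinite(lp):
        return lp
    return lp + log_likelihood(x)

print(log_posterior(0.5))  # prior + likelihood inside the support
print(log_posterior(2.0))  # -inf: rejected by the prior alone
```

The early return matters for samplers: proposals outside the prior support are rejected without a (typically costly) forward-model evaluation.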
2 changes: 1 addition & 1 deletion pybop/optimisers/base_optimiser.py
@@ -104,7 +104,7 @@ def __init__(
self.set_base_options()
self._set_up_optimiser()

# Throw an warning if any options remain
# Throw a warning if any options remain
if self.unset_options:
warnings.warn(
f"Unrecognised keyword arguments: {self.unset_options} will not be used.",
6 changes: 6 additions & 0 deletions pybop/parameters/parameter.py
@@ -372,6 +372,12 @@ def get_sigma0(self) -> list:

return sigma0

def priors(self) -> list:
"""
Return the prior distribution of each parameter.
"""
return [param.prior for param in self.param.values()]

def initial_value(self) -> np.ndarray:
"""
Return the initial value of each parameter.