Adds Monte Carlo Samplers #340

Merged (44 commits, Sep 2, 2024)

Commits
89ae3e6
feat: initial Monte Carlo classes
BradyPlanden May 24, 2024
3c48f88
feat: updt __init__.py, add LogPosterior
BradyPlanden May 24, 2024
a62a126
tests: add unit tests for MCMC samplers
BradyPlanden Jun 3, 2024
90bb173
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 3, 2024
fbe56a4
fix parallel for windows
BradyPlanden Jun 3, 2024
8f74b6d
tests: additional unit tests, refactors priors class
BradyPlanden Jun 3, 2024
2d83315
tests: increase coverage, adds monte carlo integration test
BradyPlanden Jun 5, 2024
5ed3d23
tests: increase coverage, bugfix multi_log_pdf logic
BradyPlanden Jun 5, 2024
ca961a2
tests: increase coverage, update priors on intesampling integration t…
BradyPlanden Jun 5, 2024
da21506
tests: increment coverage, refactor prior np.inf catch
BradyPlanden Jun 5, 2024
ce1cb54
refactor: removes redundant code
BradyPlanden Jun 5, 2024
c86531b
Merge branch 'develop', updts for Parameters class
BradyPlanden Jun 7, 2024
3e4c01e
refactor: adds improvements from parameters class
BradyPlanden Jun 7, 2024
b5ec8fe
feat: Adds burn-in functionality for sampling class
BradyPlanden Jun 15, 2024
f71bf6a
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 17, 2024
1f7c6cb
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jun 18, 2024
c8db9f5
fix: correct sigma0 to cov0
BradyPlanden Jun 18, 2024
eaaebb2
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 3, 2024
fb97b5d
refactor: move general methods into parent class, replace burn_in wit…
BradyPlanden Jul 3, 2024
942dc5e
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 4, 2024
990c590
Apply suggestions from code review
BradyPlanden Jul 4, 2024
5f89231
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 16, 2024
74b1f30
Merge branch 'develop' into monte-carlo-methods
BradyPlanden Jul 16, 2024
13aa83f
refactor: log_pdf to base class, update docstrings.
BradyPlanden Jul 21, 2024
0b32889
Adds catches and initialisation for x0, update tests
BradyPlanden Jul 21, 2024
0117066
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 6, 2024
251f86f
feat: updates for transformation class, cleanup
BradyPlanden Aug 7, 2024
5bb4d94
fix: uniformly apply bound transformations, update LogPosterior
BradyPlanden Aug 7, 2024
a5244a4
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 7, 2024
255aa5d
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 7, 2024
c9946da
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 13, 2024
cd07072
refactor: ComposedLogPrior -> JointLogPrior, prior.evaluateS1 -> logp…
BradyPlanden Aug 13, 2024
08dc407
fix: update tests for low convergence sampler
BradyPlanden Aug 13, 2024
c225065
refactor: update priors, refactor JointLogPrior
BradyPlanden Aug 14, 2024
4df0885
tests: update unit tests and increase coverage.
BradyPlanden Aug 14, 2024
e50812a
refactor: base_sampler init, update docstrings, update tests, remove …
BradyPlanden Aug 14, 2024
7a000cf
tests: increase coverage, remove redundant ValueError, sampler.chains…
BradyPlanden Aug 14, 2024
711dcc8
tests: restore parallel optimisation with thread limit to 1
BradyPlanden Aug 14, 2024
df1cc73
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 22, 2024
bca3bbb
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 22, 2024
248b161
Merge branch 'refs/heads/develop' into monte-carlo-methods
BradyPlanden Aug 29, 2024
503af19
Refactor and bugfixes. Adds gradient-based integration sampling tests…
BradyPlanden Aug 29, 2024
85e1ce1
Remainder review suggestions, update assert tolerances, small array d…
BradyPlanden Aug 30, 2024
8a928af
tests: increment iterations from scheduled test run
BradyPlanden Sep 2, 2024

Files changed
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -2,6 +2,7 @@

## Features

- [#6](https://github.com/pybop-team/PyBOP/issues/6) - Adds Monte Carlo sampling functionality, with methods based on Pints' algorithms. Adds a base class, `BaseSampler`, in addition to `BasePintsSampler`.
- [#353](https://github.com/pybop-team/PyBOP/issues/353) - Allow user-defined check_params functions to enforce nonlinear constraints, and enable SciPy constrained optimisation methods
- [#222](https://github.com/pybop-team/PyBOP/issues/222) - Adds an example for performing electrode balancing.
- [#441](https://github.com/pybop-team/PyBOP/issues/441) - Adds an example for estimating constants within a `pybamm.FunctionalParameter`.
@@ -38,7 +39,6 @@

## Bug Fixes


## Breaking Changes

# [v24.6](https://github.com/pybop-team/PyBOP/tree/v24.6) - 2024-07-08
94 changes: 94 additions & 0 deletions examples/scripts/mcmc_example.py
@@ -0,0 +1,94 @@
import numpy as np
import plotly.graph_objects as go
import pybamm

import pybop

# Parameter set and model definition
solver = pybamm.IDAKLUSolver()
parameter_set = pybop.ParameterSet.pybamm("Chen2020")
parameter_set.update(
{
"Negative electrode active material volume fraction": 0.63,
"Positive electrode active material volume fraction": 0.71,
}
)
synth_model = pybop.lithium_ion.DFN(parameter_set=parameter_set, solver=solver)

# Fitting parameters
parameters = pybop.Parameters(
pybop.Parameter(
"Negative electrode active material volume fraction",
prior=pybop.Gaussian(0.68, 0.02),
transformation=pybop.LogTransformation(),
),
pybop.Parameter(
"Positive electrode active material volume fraction",
prior=pybop.Gaussian(0.65, 0.02),
transformation=pybop.LogTransformation(),
),
)

# Generate data
init_soc = 0.5
sigma = 0.002
experiment = pybop.Experiment(
[
("Discharge at 0.5C for 6 minutes (5 second period)",),
]
)
values = synth_model.predict(
initial_state={"Initial SoC": init_soc}, experiment=experiment
)


def noise(sigma):
    # Zero-mean Gaussian noise sized to match the synthetic voltage data
    return np.random.normal(0, sigma, len(values["Voltage [V]"].data))


# Form dataset
dataset = pybop.Dataset(
{
"Time [s]": values["Time [s]"].data,
"Current function [A]": values["Current [A]"].data,
"Voltage [V]": values["Voltage [V]"].data + noise(sigma),
"Bulk open-circuit voltage [V]": values["Bulk open-circuit voltage [V]"].data
+ noise(sigma),
}
)

model = pybop.lithium_ion.SPM(parameter_set=parameter_set, solver=pybamm.IDAKLUSolver())
model.build(initial_state={"Initial SoC": init_soc})
signal = ["Voltage [V]", "Bulk open-circuit voltage [V]"]

# Generate problem, likelihood, and sampler
problem = pybop.FittingProblem(model, parameters, dataset, signal=signal)
likelihood = pybop.GaussianLogLikelihoodKnownSigma(problem, sigma0=0.002)
posterior = pybop.LogPosterior(likelihood)

optim = pybop.DifferentialEvolutionMCMC(
posterior,
chains=3,
max_iterations=300,
warm_up=100,
verbose=True,
# parallel=True, # uncomment to enable parallelisation (MacOS/WSL/Linux only)
)
result = optim.run()

# Create a histogram
fig = go.Figure()
for data in result:
fig.add_trace(go.Histogram(x=data[:, 0], name="Neg", opacity=0.75))
fig.add_trace(go.Histogram(x=data[:, 1], name="Pos", opacity=0.75))

# Update layout for better visualization
fig.update_layout(
title="Posterior distribution of volume fractions",
xaxis_title="Value",
yaxis_title="Count",
barmode="overlay",
)

# Show the plot
fig.show()
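
The chains returned by run() can also be summarised numerically. A minimal post-processing sketch, assuming result behaves as an iterable of (n_samples, n_parameters) arrays, as the histogram loop above implies; the stacking and axis handling here are an assumption, not part of the original example:

# Hedged sketch: stack the chains and report posterior summary statistics.
# Assumes each chain in `result` is an (n_samples, n_parameters) array.
chains = np.asarray(result)
means = chains.mean(axis=(0, 1))  # posterior mean per parameter
stds = chains.std(axis=(0, 1))  # posterior spread per parameter
for name, mu, sd in zip(["Negative", "Positive"], means, stds):
    print(f"{name} electrode volume fraction: {mu:.4f} +/- {sd:.4f}")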
29 changes: 24 additions & 5 deletions pybop/__init__.py
@@ -71,7 +71,7 @@
#
from .parameters.parameter import Parameter, Parameters
from .parameters.parameter_set import ParameterSet
from .parameters.priors import BasePrior, Gaussian, Uniform, Exponential
from .parameters.priors import BasePrior, Gaussian, Uniform, Exponential, JointLogPrior

#
# Model classes
@@ -83,15 +83,15 @@
from .models.base_model import Inputs

#
# Problem class
# Problem classes
#
from .problems.base_problem import BaseProblem
from .problems.fitting_problem import FittingProblem
from .problems.multi_fitting_problem import MultiFittingProblem
from .problems.design_problem import DesignProblem

#
# Cost function class
# Cost classes
#
from .costs.base_cost import BaseCost
from .costs.fitting_costs import (
@@ -110,12 +110,13 @@
BaseLikelihood,
GaussianLogLikelihood,
GaussianLogLikelihoodKnownSigma,
LogPosterior,
MAP,
)
from .costs._weighted_cost import WeightedCost

#
# Optimiser class
# Optimiser classes
#

from .optimisers._cuckoo import CuckooSearchImpl
@@ -141,14 +142,32 @@
)
from .optimisers.optimisation import Optimisation

#
# Monte Carlo classes
#
from .samplers.base_sampler import BaseSampler
from .samplers.base_pints_sampler import BasePintsSampler
from .samplers.pints_samplers import (
NUTS, DREAM, AdaptiveCovarianceMCMC,
DifferentialEvolutionMCMC, DramACMC,
EmceeHammerMCMC,
HaarioACMC, HaarioBardenetACMC,
HamiltonianMCMC, MALAMCMC,
MetropolisRandomWalkMCMC, MonomialGammaHamiltonianMCMC,
PopulationMCMC, RaoBlackwellACMC,
RelativisticMCMC, SliceDoublingMCMC,
SliceRankShrinkingMCMC, SliceStepoutMCMC,
)
from .samplers.mcmc_sampler import MCMCSampler

#
# Observer classes
#
from .observers.unscented_kalman import UnscentedKalmanFilterObserver
from .observers.observer import Observer

#
# Plotting class
# Plotting classes
#
from .plotting.plotly_manager import PlotlyManager
from .plotting.standard_plots import StandardPlot, StandardSubplot, plot_trajectories
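
For reference, the sampler and prior classes registered above are exposed from the top-level namespace. A minimal import check; the names are taken directly from the diff, and nothing beyond importing them is shown here:

import pybop

# New public classes added by this pull request, per the __init__.py diff
print(pybop.BaseSampler, pybop.BasePintsSampler, pybop.MCMCSampler)
print(pybop.JointLogPrior, pybop.LogPosterior)
print(pybop.NUTS, pybop.DifferentialEvolutionMCMC)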
92 changes: 91 additions & 1 deletion pybop/costs/_likelihoods.py
@@ -4,7 +4,7 @@

from pybop.costs.base_cost import BaseCost
from pybop.parameters.parameter import Parameter, Parameters
from pybop.parameters.priors import Uniform
from pybop.parameters.priors import JointLogPrior, Uniform
from pybop.problems.base_problem import BaseProblem


@@ -290,3 +290,93 @@ def compute(
log_likelihood = self.likelihood.compute(y)
posterior = log_likelihood + log_prior
return posterior


class LogPosterior(BaseCost):
"""
The Log Posterior for a given problem.

Computes the log posterior, which is the sum of the
log likelihood and the log prior.

Inherits all parameters and attributes from ``BaseCost``.
"""

def __init__(self, log_likelihood, log_prior=None):
super().__init__(problem=log_likelihood.problem)

# Store the likelihood and prior
self._log_likelihood = log_likelihood
if log_prior is None:
self._prior = JointLogPrior(*log_likelihood.problem.parameters.priors())
else:
self._prior = log_prior

def compute(
self,
y: dict,
dy: np.ndarray = None,
calculate_grad: bool = False,
) -> Union[float, tuple[float, np.ndarray]]:
"""
Calculate the posterior cost for a given set of parameters.

Parameters
----------
x : array-like
The parameters for which to evaluate the cost.
grad : array-like, optional
An array to store the gradient of the cost function with respect
to the parameters.

Returns
-------
float
The posterior cost.
"""
# Verify we have dy if calculate_grad is True
self.verify_args(dy, calculate_grad)

if calculate_grad:
log_prior, dp = self._prior.logpdfS1(self.parameters.current_value())
else:
log_prior = self._prior.logpdf(self.parameters.current_value())

if not np.isfinite(log_prior).any():
return (-np.inf, -self.grad_fail) if calculate_grad else -np.inf

if calculate_grad:
log_likelihood, dl = self._log_likelihood.compute(
y, dy, calculate_grad=True
)

posterior = log_likelihood + log_prior
total_gradient = dl + dp

return posterior, total_gradient

log_likelihood = self._log_likelihood.compute(y)
posterior = log_likelihood + log_prior
return posterior

def prior(self):
"""
Return the prior object.

Returns
-------
object
The prior object.
"""
return self._prior

def likelihood(self):
"""
Return the likelihood object.

Returns
-------
object
The likelihood object.
"""
return self._log_likelihood
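
To make the default-prior behaviour concrete: when no log_prior is supplied, the constructor assembles a JointLogPrior from the problem's parameter priors, so the two construction paths below should be equivalent. A hedged sketch, where problem is a previously built pybop.FittingProblem:

likelihood = pybop.GaussianLogLikelihoodKnownSigma(problem, sigma0=0.002)

# Default path: the prior is assembled internally from the parameter priors
posterior = pybop.LogPosterior(likelihood)

# Explicit path: equivalent construction, mirroring __init__ above
prior = pybop.JointLogPrior(*likelihood.problem.parameters.priors())
posterior = pybop.LogPosterior(likelihood, log_prior=prior)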
2 changes: 1 addition & 1 deletion pybop/optimisers/base_optimiser.py
@@ -115,7 +115,7 @@ def __init__(
self.set_base_options()
self._set_up_optimiser()

# Throw an warning if any options remain
# Throw a warning if any options remain
if self.unset_options:
warnings.warn(
f"Unrecognised keyword arguments: {self.unset_options} will not be used.",
28 changes: 19 additions & 9 deletions pybop/parameters/parameter.py
@@ -56,6 +56,9 @@
self.value = initial_value
self.transformation = transformation
self.applied_prior_bounds = False
self.bounds = None
self.lower_bounds = None
self.upper_bounds = None
self.set_bounds(bounds)
self.margin = 1e-4

@@ -162,17 +165,9 @@
if bounds is not None:
if bounds[0] >= bounds[1]:
raise ValueError("Lower bound must be less than upper bound")
elif self.transformation is not None:
self.lower_bound = np.ndarray.item(
self.transformation.to_search(bounds[0])
)
self.upper_bound = np.ndarray.item(
self.transformation.to_search(bounds[1])
)
else:
self.lower_bound = bounds[0]
self.upper_bound = bounds[1]

elif self.prior is not None:
self.applied_prior_bounds = True
self.lower_bound = self.prior.mean - boundary_multiplier * self.prior.sigma
Expand All @@ -181,6 +176,13 @@
else:
self.bounds = None
return
if self.transformation is not None:
self.lower_bound = np.ndarray.item(
self.transformation.to_search(self.lower_bound)
)
self.upper_bound = np.ndarray.item(
self.transformation.to_search(self.upper_bound)
)

self.bounds = [self.lower_bound, self.upper_bound]

@@ -409,6 +411,12 @@
sigma0 = None
return sigma0

def priors(self) -> list:
"""
Return the prior distribution of each parameter.
"""
return [param.prior for param in self.param.values()]

def initial_value(self) -> np.ndarray:
"""
Return the initial value of each parameter.
@@ -541,7 +549,9 @@
"""
if inputs is None or isinstance(inputs, dict):
return inputs
elif (isinstance(inputs, list) and all(is_numeric(x) for x in inputs)) or all(
if isinstance(inputs, np.ndarray) and inputs.ndim == 0:
inputs = inputs[np.newaxis]

if (isinstance(inputs, list) and all(is_numeric(x) for x in inputs)) or all(
is_numeric(x) for x in list(inputs)
):
return self.as_dict(inputs)
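
Finally, the set_bounds refactor above applies the transformation to the bounds uniformly, whether they are supplied explicitly or derived from the prior. An illustrative sketch; the expected values assume LogTransformation maps bounds into log space:

import numpy as np
import pybop

param = pybop.Parameter(
    "Negative electrode active material volume fraction",
    prior=pybop.Gaussian(0.68, 0.02),
    bounds=[0.5, 0.8],
    transformation=pybop.LogTransformation(),
)
# Bounds are stored in the search (log) space: roughly [-0.693, -0.223]
print(param.bounds, np.log([0.5, 0.8]))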