
Feature Request: mix arguments of different-sized arrays (& floats)? #982

Open

pfackeldey opened this issue Apr 5, 2024 · 6 comments

@pfackeldey
Dear iminuit developers,

Thank you very much for this great package!

I am the author of evermore, a pure JAX-based package to build binned likelihoods in HEP. With it, one can construct arbitrary PyTrees of nuisance parameters and use them in a loss function. Being able to group parameters into arrays to modify bin contents in a vectorized fashion is highly efficient (especially for Barlow-Beeston[-lite]). Users have some parameters that are just single values (e.g. a single cross-section uncertainty), and some that are represented as arrays (e.g. Barlow-Beeston statistical uncertainties).

Thus, I'd like to ask whether it would be possible to add support for mixtures of differently sized arrays (and floats), e.g.:

import numpy as np
from iminuit import Minuit

def fun(x, c):
    # c has size 1, so take its single element to return a scalar
    return c[0] + x[0]**2 + x[1]**4

Minuit(fun, x=np.ones(2), c=np.ones(1))

This is particularly handy when working with JAX loss functions, where the parameters (x, c) are in practice often a nested PyTree of jax.Arrays of arbitrary size:

import jax.numpy as jnp
import jax.tree_util as jtu
from functools import partial

params = {"x": jnp.ones(2), "c": jnp.ones(1)}

def fun(params):
    x = params["x"]
    c = params["c"]
    # c has size 1, so take its single element to return a scalar loss
    return c[0] + x[0]**2 + x[1]**4

def wrapped_fun(flat_params, treedef):
    # restore the PyTree structure before evaluating the loss
    params = jtu.tree_unflatten(treedef, flat_params)
    return fun(params)

# flatten the PyTree into a list of leaf arrays plus its structure
flat_params, treedef = jtu.tree_flatten(params)

Minuit(partial(wrapped_fun, treedef=treedef), flat_params, name=treedef.node_data()[1])

In this example params is just a flat dictionary, but this would also work with arbitrary (nested) PyTree structures if iminuit supported arrays of arbitrary size for the loss function kwargs.

Best, Peter

PS: JAX optimisers, e.g. optax, can minimise directly with respect to these PyTree structures. The minimiser returns the original PyTree structure, but its leaves contain the fitted parameter values. In that case one does not even need the wrapped_fun step to convert a PyTree into a list of arguments.
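For illustration, a minimal sketch of minimising such a PyTree directly with optax; the loss function mirrors the toy example above, and the learning rate and step count are arbitrary placeholders:

import jax
import jax.numpy as jnp
import optax

params = {"x": jnp.ones(2), "c": jnp.ones(1)}

def fun(params):
    x, c = params["x"], params["c"]
    return c[0] + x[0] ** 2 + x[1] ** 4

# optax works on the PyTree directly; no flattening required
optimizer = optax.sgd(learning_rate=1e-2)
opt_state = optimizer.init(params)

@jax.jit
def step(params, opt_state):
    loss, grads = jax.value_and_grad(fun)(params)
    updates, opt_state = optimizer.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state, loss

for _ in range(100):
    params, opt_state, loss = step(params, opt_state)
# params is still {"x": ..., "c": ...}, with fitted values in its leaves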

@HDembinski (Member) commented Apr 12, 2024

Hi, I have considered a similar feature every now and again, but there never seemed to be a strong enough use case to justify the time.

evermore sounds like an interesting library, I considered writing something like that myself (as I am also a fan of JAX).

If you have differentiable likelihoods in JAX, I don't see why you would need iminuit. You can use the optax minimizers, and you can compute uncertainties as well, at least the analog of the HESSE algorithm in MINUIT: compute the Hessian at the minimum with JAX and invert it. If your original function is a negative log-likelihood, this produces the covariance matrix of the parameters.
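A minimal sketch of that recipe, assuming nll is a negative log-likelihood over a flat parameter array and x_min is the minimum found by any minimizer (both are placeholder names):

import jax
import jax.numpy as jnp

def nll(x):  # placeholder negative log-likelihood
    return 0.5 * jnp.sum((x - jnp.array([1.0, 2.0])) ** 2)

x_min = jnp.array([1.0, 2.0])  # minimum found beforehand

# HESSE analog: invert the Hessian of the NLL at the minimum
hess = jax.hessian(nll)(x_min)
cov = jnp.linalg.inv(hess)       # covariance matrix of the parameters
errors = jnp.sqrt(jnp.diag(cov)) # parameter uncertainties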

@pfackeldey (Author)

Hi @HDembinski,

Thank you very much for your reply :)

Indeed, it is possible to just use a first-order minimizer and to compute the Hessian at the minimum with JAX and invert it. However, I am currently comparing evermore's features with similar tools. These tools always (and only) use Minuit for minimization, so a fair comparison of, e.g., a likelihood profile between evermore and these tools should use the same minimizer.
Apart from that, I have received general feedback that people like to use Minuit because of its robustness and its potential to reach the minimum faster than first-order minimizers.

These two points are my main motivation to use iminuit with evermore.

Best, Peter

PS: But I fully agree with you... I personally have had a pretty robust and fast experience with optax.sgd so far, even for HEP-like fitting problems.

@HDembinski (Member)
Regarding Barlow-Beeston, I recommend having a look at our new method if you are not already aware of it. It is implemented as the default in the Template class, and we also published a paper about it.
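For reference, a minimal sketch of fitting template yields with iminuit.cost.Template, assuming the Template(n, xe, templates) calling convention; the histograms below are made-up placeholders:

import numpy as np
from iminuit import Minuit
from iminuit.cost import Template

xe = np.array([0.0, 1.0, 2.0, 3.0])  # bin edges
n = np.array([55, 60, 45])           # observed counts
t1 = np.array([30.0, 10.0, 5.0])     # template for component 1
t2 = np.array([20.0, 40.0, 35.0])    # template for component 2

# the default method accounts for statistical template uncertainties
cost = Template(n, xe, (t1, t2))
m = Minuit(cost, 50, 90)             # initial guesses for the yields
m.migrad()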

@HDembinski (Member) commented Aug 1, 2024

I think the proper solution is to directly support functions which accept a PyTree instead of an array, as a generalization of the array-call mode that iminuit already supports. Then you wouldn't have to write wrapped_fun.

@pfackeldey (Author)

> I think the proper solution is to directly support functions which accept a PyTree instead of an array, as a generalization of the array-call mode that iminuit already supports. Then you wouldn't have to write wrapped_fun.

That's actually what I do (and encourage) in evermore (see the "PS:" in my first comment). Support for arbitrary PyTrees of arrays would be great; I'd be happy to see it!
This is, by the way, also how optax works, i.e., you can optimise an arbitrary PyTree of parameters/arrays.

@HDembinski (Member)

I know how optax works :).

I quickly looked into it, but supporting PyTrees directly in iminuit does not seem to be a good fit. The idea that parameters are discoverable from the signature of the cost function is deeply ingrained in the library:

def cost(par1, ..., parN):

where par1 to parN are floats. iminuit detects the number of parameters and their names by inspecting this signature. A function that accepts a single PyTree exposes neither, so both would always have to be passed explicitly.
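For illustration, a small sketch of that signature inspection; the cost function and its parameter names are placeholders:

from iminuit import Minuit

def cost(a, b):                 # parameter names read from the signature
    return (a - 1.0) ** 2 + (b - 2.0) ** 2

m = Minuit(cost, a=0.0, b=0.0)
print(m.parameters)             # ('a', 'b'), detected automatically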

When I took over maintainership, I pushed through support for cost functions which accept arrays

def cost(par_array):

which already created the problem that parameter names cannot be detected anymore, so they have to be passed explicitly with name=.... I now regret this, because when I started to develop the built-in cost functions in iminuit.cost, I realized how challenging it is to support models in both formats.
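A minimal sketch of that array-call mode as it exists today; the quadratic cost below is a placeholder:

import numpy as np
from iminuit import Minuit

def cost(par):                      # par is a single 1D array of parameters
    return np.sum((par - np.array([1.0, 2.0])) ** 2)

# names cannot be inferred from the signature, so pass them via name=
m = Minuit(cost, np.zeros(2), name=("a", "b"))
m.migrad()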
