As we work on running more than one Ersilia model in parallel, @JHlozek highlighted the scenario where we want to run the same model in multiple processes/terminals. This is a particularly relevant case for repositories like Olinda, where we need to run precalculations across a large set of inputs.
Objective(s)
Run the same model in multiple terminals/processes (i.e. sessions).
Optionally, make sure parallelization works both in Docker and in Conda serving modes.
Ideally, parallelization should work both from the CLI (i.e. ersilia run -i ...) and from the Python API (mdl.run(...)). Parallelizing via the Python API may be more difficult and is less critical.
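One way the objectives above could be orchestrated from Python is to split the input set into chunks and dispatch each chunk to its own worker, effectively running several sessions of the same model at once. The sketch below is a minimal, hypothetical scaffold and not part of the Ersilia API: the `run_fn` argument is a placeholder for whatever actually runs a chunk, e.g. a call to `mdl.run(...)` or a `subprocess` invocation of `ersilia run -i ...`.

```python
from concurrent.futures import ThreadPoolExecutor


def run_in_parallel(run_fn, inputs, n_workers=4, chunk_size=100):
    """Split `inputs` into chunks and apply `run_fn` to each chunk concurrently.

    `run_fn` is a stand-in: in practice it would wrap a call to the Ersilia
    Python API (mdl.run) or spawn an `ersilia run` subprocess per chunk.
    Threads are used because each worker is expected to be I/O-bound
    (waiting on a served model), not CPU-bound.
    """
    chunks = [inputs[i:i + chunk_size] for i in range(0, len(inputs), chunk_size)]
    results = []
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        # Executor.map preserves input order, so results line up with inputs.
        for chunk_result in pool.map(run_fn, chunks):
            results.extend(chunk_result)
    return results
```

With a Docker- or Conda-served model, `run_fn` could instead shell out to the CLI in each worker, which corresponds to the multiple-terminals scenario described above.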
Documentation
No specific documentation is available for this yet, but we should include parallelization in our main Gitbook documentation and the README file.