
Allow Bayesian Optimization to use Pre-trained GP ROM #2280

Merged: 6 commits merged into idaholab:devel from wangc/BayesianOpt on Mar 11, 2024

Conversation

wangcj05
Collaborator

@wangcj05 wangcj05 commented Mar 8, 2024


Pull Request Description

What issue does this change request address? (Use "#" before the issue to link it, i.e., #42.)

close #2281

What are the significant changes in functionality due to this change request?

For optimal experimental design, using an existing experimental dataset to guide future exploration and exploitation is beneficial. This PR allows a pre-trained GP ROM to be used in Bayesian Optimization, where the pre-trained GP ROM is trained on the existing experimental dataset.
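The idea can be sketched outside of RAVEN with a toy example. Nothing below is RAVEN code: the objective function, the fixed RBF length scale, the grid of candidates, and the expected-improvement loop are all illustrative assumptions. The GP surrogate is first "pre-trained" on a small historical dataset, then refined as Bayesian optimization requests new samples.

```python
# Minimal sketch (NOT the RAVEN implementation): seed Bayesian optimization
# with a GP regressor pre-trained on existing experimental data, then refine
# it with new evaluations chosen by expected improvement (minimization).
import numpy as np
from scipy.stats import norm

def objective(x):
    # Toy objective standing in for an expensive experiment (assumed).
    return np.sin(3.0 * x) + x**2 - 0.7 * x

def rbfKernel(a, b, lengthScale=0.5):
    # Squared-exponential kernel on 1-D inputs (length scale is assumed).
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / lengthScale) ** 2)

def gpPredict(xTrain, yTrain, xTest, noise=1e-8):
    # Standard GP regression equations, zero-mean prior on centered data.
    yMean = yTrain.mean()
    kTrain = rbfKernel(xTrain, xTrain) + noise * np.eye(len(xTrain))
    kCross = rbfKernel(xTest, xTrain)
    alpha = np.linalg.solve(kTrain, yTrain - yMean)
    mu = kCross @ alpha + yMean
    v = np.linalg.solve(kTrain, kCross.T)
    var = 1.0 - np.sum(kCross * v.T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expectedImprovement(mu, sigma, yBest):
    # Expected improvement acquisition for minimization.
    z = (yBest - mu) / sigma
    return (yBest - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# "Pre-trained" stage: historical experimental data already in hand.
xHist = np.array([-0.9, 0.1, 0.8])
yHist = objective(xHist)

# Bayesian optimization loop over a candidate grid, warm-started by the data.
grid = np.linspace(-2.0, 2.0, 401)
for _ in range(10):
    mu, sigma = gpPredict(xHist, yHist, grid)
    ei = expectedImprovement(mu, sigma, yHist.min())
    xNew = grid[np.argmax(ei)]
    xHist = np.append(xHist, xNew)
    yHist = np.append(yHist, objective(xNew))

print(round(float(yHist.min()), 3))
```

The historical dataset plays the same role as the pre-trained GP ROM here: the surrogate starts with informed mean and uncertainty estimates instead of a blank prior, so early acquisition steps are already guided by the existing experiments.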


For Change Control Board: Change Request Review

The following review must be completed by an authorized member of the Change Control Board.

  • 1. Review all computer code.
  • 2. If any changes occur to the input syntax, there must be an accompanying change to the user manual and xsd schema. If the input syntax change deprecates existing input files, a conversion script needs to be added (see Conversion Scripts).
  • 3. Make sure the Python code and commenting standards are respected (camelBack, etc.) - see the wiki for details.
  • 4. Automated Tests should pass, including run_tests, pylint, manual building and xsd tests. If there are changes to Simulation.py or JobHandler.py the qsub tests must pass.
  • 5. If significant functionality is added, there must be tests added to check this. Tests should cover all possible options. Multiple short tests are preferred over one large test. If new development on the internal JobHandler parallel system is performed, a cluster test must be added setting, in XML block, the node <internalParallel> to True.
  • 6. If the change modifies or adds a requirement or a requirement based test case, the Change Control Board's Chair or designee also needs to approve the change. The requirements and the requirements test shall be in sync.
  • 7. The merge request must reference an issue. If the issue is closed, the issue close checklist shall be done.
  • 8. If an analytic test is changed/added, is the analytic documentation updated/added?
  • 9. If any test used as a basis for documentation examples (currently found in raven/tests/framework/user_guide and raven/docs/workshop) have been changed, the associated documentation must be reviewed and assured the text matches the example.

dylanjm
dylanjm previously approved these changes Mar 8, 2024
Collaborator

@dylanjm dylanjm left a comment


Changes look good, I do have one question, but probably nothing to hold up the merging process.

@@ -235,6 +236,8 @@ def initialize(self, externalSeeding=None, solutionExport=None):
self.raiseAnError(RuntimeError, f'GPR ROM <target> should be objective variable: {self._objectiveVar}, '
f'Received {self._model.supervisedContainer[0].target}')

if self._resetModel:
Collaborator


I can't find any logic in this file that would switch the default value of False for _resetModel to True. Do you expect this value to change before this branch of code is reached?

Collaborator Author


This variable is defined in the init method, and the default is currently False. I defined it so that, if needed, we can add an XML node to the input file to control it. For now, the default behavior works fine.
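As a hedged illustration of the pattern described above (the node name `<resetModel>` is hypothetical, not actual RAVEN input syntax), an optional XML flag with a False default could be read like this:

```python
# Sketch (hypothetical node name, not RAVEN's real input schema): read an
# optional boolean child node, falling back to a default when it is absent.
import xml.etree.ElementTree as ET

def readResetFlag(xmlText, default=False):
    """Return the boolean value of an optional <resetModel> child node."""
    root = ET.fromstring(xmlText)
    node = root.find('resetModel')
    if node is None or node.text is None:
        return default  # node absent: keep the default behavior
    return node.text.strip().lower() in ('true', '1', 'yes')

print(readResetFlag('<Optimizer><resetModel>True</resetModel></Optimizer>'))
print(readResetFlag('<Optimizer/>'))
```

Keeping the flag in the init method with a False default means existing input files are unaffected, while the branch guarded by `_resetModel` stays reachable once an input node is wired up.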

Collaborator

@dylanjm dylanjm left a comment


Changes are good and are ready to merge.

@dylanjm dylanjm merged commit d02c25d into idaholab:devel Mar 11, 2024
12 checks passed
@wangcj05 wangcj05 deleted the wangc/BayesianOpt branch March 21, 2024 21:11
Successfully merging this pull request may close these issues.

[TASK] Allow Bayesian Optimization to use Pre-trained GP ROM