Features: maximize information entropy and variable reporting interval. #170

Merged
2 commits merged into main on Jul 9, 2024

Conversation

@rouson (Contributor) commented on Jul 9, 2024

This pull request adds `--report` and `--bins` command-line arguments to the train-cloud-microphysics app:

  1. `--report` sets the reporting interval, i.e., the number of epochs between printing the cost function to the screen and writing it to the cost.plt file.
  2. `--bins` sets the number of phase-space hypercubes spanning the range of each of the 5 output variables. Only one data point per bin is retained for use in training a neural network, which maximizes the information entropy for a given retained training-data set size (see the sketch after this list).
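For intuition, the sketch below illustrates the one-point-per-hypercube subsampling that `--bins` controls. It is a minimal Python illustration, not the app's actual Fortran implementation; the function name, array layout, and keep-the-first-point tie-breaking rule are assumptions.

```python
import numpy as np

def subsample_by_bins(outputs: np.ndarray, num_bins: int) -> np.ndarray:
    """Keep at most one data point per phase-space hypercube.

    outputs:  shape (num_points, 5), the 5 output variables per data point.
    num_bins: bins spanning the range of each output variable (the --bins value).
    Returns the indices of the retained data points.
    """
    lo, hi = outputs.min(axis=0), outputs.max(axis=0)
    width = np.where(hi > lo, (hi - lo) / num_bins, 1.0)  # guard degenerate ranges
    # Integer bin coordinates of each point in the 5-D phase space.
    coords = np.clip(((outputs - lo) / width).astype(int), 0, num_bins - 1)
    # Retain the first point that lands in each occupied hypercube.
    _, keep = np.unique(coords, axis=0, return_index=True)
    return np.sort(keep)
```

With `--bins 4`, for example, the 5-D output space is divided into at most 4^5 = 1024 hypercubes and at most one point per occupied cube is retained; the resulting roughly uniform coverage of the occupied bins is what the description above refers to as maximizing information entropy for a given retained data set size.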

Commit 1: The new `--report` argument sets the reporting interval, i.e., the number of epochs between printing the cost function to the screen and writing it to the cost.plt file.

Commit 2: This commit adds the `--bins` command-line argument, which sets the number of phase-space hypercubes across the range of each of the 5 output variables. Only one data point per bin is retained for use in training a neural network. This choice maximizes the information entropy for a given retained training-data set size.
@rouson requested a review from davytorres on Jul 9, 2024 at 21:39
@davytorres (Collaborator) left a comment:


LGTM

@davytorres merged commit 85f5a03 into main on Jul 9, 2024
4 checks passed