
Train cloud microphysics

@rouson released this 26 Nov 04:54 · commit 9673024

This is the first release containing an app/train_cloud_microphysics.f90 program and a training_configuration.json file that exhibit convergent behavior of the (Adam) training algorithm on an ICAR-generated training data set, as demonstrated by a monotonically decreasing cost function:

    ./build/run-fpm.sh run train-cloud-microphysics -- --base training --epochs 10 --start 720
    ...
            Epoch         Cost (avg)
               1  0.121759593
               2   1.61784310E-02
               3   5.31613547E-03
               4   2.68347375E-03
               5   1.63242721E-03
               6   1.11283606E-03
               7   8.27088661E-04
               8   6.59517595E-04
               9   5.56710584E-04
              10   4.91619750E-04
     Training time:    39.319034000000002      for          10 epochs
     System clock time:    353.68379099999999
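
For readers unfamiliar with the optimizer named above, the Adam algorithm of Kingma & Ba maintains exponentially decaying first- and second-moment estimates of the gradient and applies a bias-corrected update. The toy program below is a minimal, self-contained sketch of that update on a one-dimensional quadratic cost; it is purely illustrative and does not use this repository's API, network, or training data:

    program adam_step_sketch
      !! Illustrative one-dimensional Adam update on the toy cost (w - 1)**2.
      !! Hyperparameter values are the conventional Adam defaults, not taken
      !! from training_configuration.json.
      implicit none
      real, parameter :: alpha = 0.001, beta1 = 0.9, beta2 = 0.999, epsilon = 1.e-7
      real :: w = 0.5, m = 0., v = 0., g, m_hat, v_hat
      integer :: t

      do t = 1, 10
        g = 2.*(w - 1.)                 ! gradient of the toy cost (w - 1)**2
        m = beta1*m + (1. - beta1)*g    ! first-moment (mean) estimate
        v = beta2*v + (1. - beta2)*g*g  ! second-moment (uncentered variance) estimate
        m_hat = m/(1. - beta1**t)       ! bias corrections
        v_hat = v/(1. - beta2**t)
        w = w - alpha*m_hat/(sqrt(v_hat) + epsilon)
        print *, "step", t, "cost", (w - 1.)**2
      end do
    end program

The steadily decreasing toy cost printed by this sketch mirrors, in miniature, the monotone cost reduction reported in the table above.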

What's Changed

  • fix(deploy-docs.yml) - use linuxbrew to install ford 7 by @rouson in #99
  • Train Thompson microphysics proxy by @rouson in #98
  • doc(README): clarify language by @rouson in #100

Full Changelog: 0.9.0...0.10.0