Update starcheck and acis regress outputs #15
Conversation
These should be more portable (GRETA).
@@ -1,4 +1,6 @@
acisfp_check \
    --outdir=out \
    --oflsdir=/data/acis/LoadReviews/2018/MAY2818/oflsa \
    --run-start=2018:142
    --oflsdir=${SKA}/data/ska_testr/test_loads/2019/MAY2019/oflsa \
My thought was that we'd just make a tarball of these test_loads (the one test load) but not sure if we should require a PR to add to ska_sync before promoting this PR.
A PR to ska_sync would be fine and can be quick enough. And not to over-optimize, but I think that the vast majority (by file size) of what's in the standard load products is not needed to run load review.
OK. In the new ska_testr data I made for this one week, I copied over only the minimal set of files starcheck and the acis checkers required anyway.
Looks good.
What's going on with the HEAD-specific stuff? That seems at odds with the stated goal (which I highly endorse) of making the testing portable. It also just looks like double the code without double the coverage, but maybe I'm missing something.
Really the difference is that we'd also stated that the primary goal of running the acis checkers in regression was to confirm that they work in their "flight" configuration. The flight configuration only works if you run out of /data/acis, because the 'acis' state builder walks the tree of the other /data/acis directories to build states.
So basically, adding the more portable versions of the ACIS-checker tests exercises the models and verifies that cmd_states/sqlite are feeding those models correctly. I think this actually is a different coverage area (and a useful one to have on GRETA), but I don't think the state_builder='acis' tests should be removed. It is a separate question whether we want to configure things to skip those HEAD-specific tests by default. Also a separate question if you want to:
acisfp_check \
    --outdir=headout \
    --oflsdir=/data/acis/LoadReviews/2018/MAY2818/oflsa \
    --run-start=2018:142
The default is --state-builder='acis'. This would be more obvious if I just called it out here explicitly.
In terms of bigger-picture getting all "supported" tests to pass on all platforms, is there a plan to include officially configured test spec files? By "supported" I mean that some tests might not be supported on certain platforms but that is OK. Right now our platforms are:
Yes, I'd like to do that (and had planned to). I'll add it as an issue (this is still probably #1 ). We might need to go back and forth to see whether the platforms you indicated should be one-to-one with the spec files, or whether it works out that we need something like a ska2 head spec (I think we still need test capability there).
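As an aside, one way the platform-to-spec-file selection could work is a simple lookup keyed on uname-style platform info. This is only a sketch; the spec file names and the pick_test_spec helper are hypothetical, not part of ska_testr:

```python
import platform

# Hypothetical mapping from (system, architecture) to a test spec file.
# The spec file names here are illustrative only.
SPEC_FILES = {
    ("Linux", "64bit"): "test_spec_linux64",
    ("Linux", "32bit"): "test_spec_linux32",
    ("Windows", "64bit"): "test_spec_windows",
    ("Darwin", "64bit"): "test_spec_mac",
}

def pick_test_spec(system=None, bits=None):
    """Select a test spec file from uname-style platform info.

    Falls back to a default spec for platforms with no dedicated file.
    """
    system = system or platform.system()
    bits = bits or platform.architecture()[0]
    return SPEC_FILES.get((system, bits), "test_spec_default")
```

This would also make the "uname fix" mentioned later a one-place change rather than something scattered through the test specs.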
Overall good with the concept. I haven't tried this or reviewed the details, I'll basically defer to your testing on this. I'm just thinking about the test matrix / acceptance criteria for this PR. In the list below "tests" refers to the tests being updated or addressed in this PR.
This assumes that some test data may be synced from |
My only go-back is WRT "No tests are required to pass on GRETA linux 32 or Windows, no need to test.": I think we can run the ACIS checks on linux 32, and for the time being I was planning to use them somewhat as a proxy for validating that xija changes won't have unexpected results on the models used in the Matlab 32 environment (which is, of course, tested separately, but I never mind a heads-up). My preference would be that all "required" test coverage come from xija built-in tests, but I really haven't looked closely at the coverage there.
Regarding:
A good question about "desirability". I think being able to run the ska_testr tests (to check out / certify an environment) is something of a normal operational use case, but I could be convinced either way.
Generally speaking, I think most users (in our organization, and in the wider world of software) do not run system regression tests. So I think of that as a developer-side task, though I understand you probably think most users should be testing.
I note that I also haven't installed starcheck in the Matlab profile, so it sounds like we'll need / want profile- or metapackage-specific test specs as well as architecture-specific ones.
Also worth noting that running the regression outputs isn't really a "test", except that the tool runs to completion without unexpected errors in the logging. The intent with the regression tests was to diff the outputs, which is still a manual step.
The ACIS load checks presently, like flight starcheck, use recent telemetry for some starting temperatures. On my laptop I don't have fptemp_11, 1dpamzt, 1deamzt, and 1pdeaat sync'd. How should this generalize? Should those be added to the standard ska_sync engineering archive set for syncing? Or should we encourage a MAUDE transition?
For the tests, I suppose we could just use a saved seed temperature for each? But I don't know what the test would then show.
A saved seed temperature seems like a good compromise. The test still would show that 99% of the code is performing as expected. |
There is even an official term for that strategy, mock testing, where you stub out interfaces that are not available in the test environment.
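For illustration, a minimal sketch of that mock-testing strategy in Python. The fetch_start_temp and run_check functions and the seed values are all hypothetical stand-ins, not the acis_thermal_check API; the point is only that the telemetry interface gets replaced by a saved seed temperature during the test:

```python
from unittest import mock

# Saved seed temperatures standing in for recent telemetry that may not
# be sync'd on the test machine (values made up for illustration).
SEED_TEMPS = {"fptemp_11": -115.0, "1dpamzt": 22.0, "1deamzt": 20.0, "1pdeaat": 25.0}

def fetch_start_temp(msid):
    """Hypothetical telemetry fetch of a starting temperature."""
    raise RuntimeError("telemetry not available in test environment")

def run_check(msid):
    """Hypothetical model run that needs a starting temperature."""
    return fetch_start_temp(msid) + 1.0  # placeholder for real model code

# In a test, stub the telemetry interface with the saved seed value so
# that everything downstream of the fetch still gets exercised.
with mock.patch(f"{__name__}.fetch_start_temp", side_effect=SEED_TEMPS.get):
    result = run_check("1dpamzt")
```

With the fetch stubbed out, the rest of the code path (the "99%" mentioned above) still runs as it would in flight.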
Makes sense, especially since the non-HEAD versions of these regress output tests are already running the checkers / models in a not-flight configuration. I haven't tried out "--T-init" since they started using acis_thermal_check but will give it a go and finish this up. |
Oops, but it looks like there is no option to turn off the validation plots which want the telemetry too (I tried days=0 but no joy). |
OK. For reference I used this bit of ska_sync.
I think this will still need a uname fix.
Update starcheck and acis regress outputs
These changes use a more atomic 'ska_testr' data area as the source data for running starcheck, and add additional acis checker tests compatible with that area. The idea is that this can be more portable (GRETA etc.) for running the tests, though I haven't run on GRETA just yet.
The new ska_testr data area (which is presently just a test load with a few ACIS-specific files copied into it) would need to be sync'd or installed as a tarball. Other ideas welcome in that regard.
With regard to portability, I also updated the acis regression tests to run twice by default: once in the more portable setup (the cut-down test load area with the 'sql' state builder) and once in the way that is most like what ACIS uses in load review (against a dir in /data/acis with the 'acis' state builder). The regress test that will only run on the HEAD LAN now has 'head' in the file name so it can be filtered easily on non-HEAD machines.
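The 'head'-in-filename convention could be applied with a filter along these lines. The helper and file names are my sketch, not actual ska_testr code:

```python
import os

def filter_regress_tests(test_files, on_head_lan):
    """Skip tests whose file name contains 'head' when not on the HEAD LAN.

    The HEAD-specific tests use the 'acis' state builder and need access
    to /data/acis, so they only make sense on the HEAD network.
    """
    if on_head_lan:
        return list(test_files)
    return [f for f in test_files if "head" not in os.path.basename(f).lower()]

# Hypothetical example file names following the naming convention.
files = ["test_acisfp_regress.py", "test_acisfp_head_regress.py"]
portable = filter_regress_tests(files, on_head_lan=False)
```

Keeping the platform knowledge in the file name means the filter needs no per-test configuration.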