
Enable GitHub CI (testing) workflows #9

Open
jmckenna opened this issue Apr 1, 2021 · 9 comments


jmckenna commented Apr 1, 2021

  • for all readers: "CI" = continuous integration testing; when someone proposes a change to the ZOO-Project source code, triggers automatically test the proposed code on various build environments (Ubuntu, Clang, MinGW, etc.)
  • previously, many Open Source projects used Travis CI
  • recently a lot of projects (long story) have moved to GitHub "workflows" or "actions" for CI testing
  • see the long list of tests now configured by GDAL at https://github.com/OSGeo/gdal/tree/master/.github/workflows

This CI testing is critical for a project; it is a must as part of our "GitHub migration" plans.
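For illustration only, here is a minimal sketch of such a workflow (hypothetical file name and placeholder build commands, not an actual ZOO-Project configuration) that could live under .github/workflows/:

```yaml
# .github/workflows/build.yml -- hypothetical example, not the actual ZOO-Project setup
name: CI build

# Trigger the job on every push and on every proposed pull request
on: [push, pull_request]

jobs:
  ubuntu-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Placeholder build step: the real dependencies and configure flags would go here
      - name: Build
        run: |
          sudo apt-get update
          ./configure && make
```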


gfenoy commented Apr 1, 2021

In ZOO-Project we used AppVeyor for years to check that the build procedure works there: https://ci.appveyor.com/project/djay/trunk-t67ri and https://ci.appveyor.com/project/djay/trunk-hp3tw.

On the other hand, the Docker image that can be built using the docker-compose.yml is another way to test, build, and publish the Docker binary image.

It is probably a good starting point for building and testing.

I would personally be in favor of the second option.
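As a rough sketch of that second option (assuming a docker-compose.yml at the repository root; the secret names and branch name are assumptions, not an existing setup):

```yaml
# Hypothetical workflow building the image with docker-compose and pushing it on success
name: Docker build
on: [push, pull_request]

jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Build the services described in docker-compose.yml
      - name: Build image
        run: docker-compose build
      # Publish only for pushes to the default branch (assumed to be "master" here)
      - name: Publish image
        if: github.event_name == 'push' && github.ref == 'refs/heads/master'
        run: |
          echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USER }}" --password-stdin
          docker-compose push
```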


omshinde commented Apr 2, 2021

I second using GitHub Actions and Docker. I found this guide, which might be helpful for our reference: https://docs.github.com/en/actions/creating-actions/creating-a-docker-container-action.
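For reference, the action.yml of such a Docker container action is short; this is only a sketch following the linked guide, with a hypothetical name and entrypoint argument:

```yaml
# action.yml -- hypothetical Docker container action, following the linked GitHub guide
name: 'ZOO build and test'
description: 'Build the ZOO-Project image and run the test suite inside it'
runs:
  using: 'docker'
  image: 'Dockerfile'     # build the container from the repository's Dockerfile
  args:
    - '/run-tests.sh'     # hypothetical argument passed to the container entrypoint
```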


gfenoy commented Jul 23, 2021

I think that this issue ZOO-Project/ZOO-Project#2 is related to this one.


gfenoy commented Jul 23, 2021

I am wondering whether the documentation should also be generated using a GitHub Action.

If we go this way, we can also imagine the same kind of generation/publication for workshop materials, for instance.
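Assuming the documentation builds with Sphinx and could be published to a gh-pages branch (the paths and the third-party deploy action are only illustrative), such a workflow might look like:

```yaml
# Hypothetical documentation workflow, assuming a Sphinx source tree under docs/
name: Documentation
on:
  push:
    branches: [master]
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build HTML documentation
        run: |
          pip install sphinx
          sphinx-build -b html docs/ build/html
      # peaceiris/actions-gh-pages is one common community action for publishing pages
      - name: Deploy to gh-pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: build/html
```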


omshinde commented Jul 23, 2021

It would be great to automatically generate the documentation using GitHub Actions. I am curious to learn about this. :-)


gfenoy commented Aug 25, 2021

During the last OGC Code Sprint, @samsouk and I published a first GitHub Action for producing and publishing the Docker image. Please see here for more details.

Yesterday, I started integrating tests within the same GitHub Action, using the cptesting tool we made available some years ago.

The issue for now is that the test script produces HTML output, which would then require publishing the resulting HTML pages to make them easy to read. So, to me, there are two options (a sketch of the first one follows below):

  1. Store the produced HTML pages, then aggregate them into a readable online report that can be checked to see whether all tests passed or some failed,
  2. Change the test script to output plain text conveying the same information that is already provided.
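A sketch of the first option, simply keeping the HTML pages produced by cptesting as a downloadable artifact of the workflow run (the report directory name is an assumption), would be a single extra step:

```yaml
      # Hypothetical step: archive the HTML report produced by the test script so it
      # can be downloaded and reviewed from the workflow run page.
      - name: Store cptesting HTML report
        if: always()                      # keep the report even when a previous step failed
        uses: actions/upload-artifact@v2
        with:
          name: cptesting-report
          path: testing/report/           # assumed output directory of the test script
```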


gfenoy commented Aug 26, 2021

I have integrated support for a local cache when building the Docker image. Also, the tests for GetCapabilities and DescribeProcess now run for both the 1.0.0 and 2.0.0 versions.

Still, the test script would benefit from being rewritten so that it fails whenever unexpected behavior is found, rather than requiring us to check the produced test outputs manually.

Any idea?
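For readers following along, one common way to get a local build cache in a GitHub Action (not necessarily the exact approach used here) combines actions/cache with Docker Buildx:

```yaml
      # Hypothetical caching setup: persist the Buildx layer cache between workflow runs
      - uses: docker/setup-buildx-action@v1
      - uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: buildx-${{ github.sha }}
          restore-keys: buildx-
      - uses: docker/build-push-action@v2
        with:
          context: .
          push: false
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
```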

omshinde commented Aug 26, 2021


@gfenoy It would be nice to have test scripts instead of checking it manually every time it is compiled. I was wondering what could count as "unexpected behavior". Can we check the status code or the received response for the GetCapabilities request?


gfenoy commented Aug 26, 2021

@omshinde actually there are multiple tests already running, starting from line 54. Not only is GetCapabilities triggered, but checking that the response conforms to the schemas is also handled there. The same occurs for DescribeProcess; for these first two requests we have support for versions 1.0.0 and 2.0.0.

For the ExecuteSync and ExecuteAsync test cases, only version 1.0.0 is supported so far. Some work is needed to support 2.0.0, specifically the creation of sample XML Execute content to be posted to the server for testing.

To answer your question, "unexpected behavior" can be an Execute request that should return status code 200 but returns something else. Another example would be that, if you provide a wrong value for a parameter, you should receive an exception report with a given status code.

Finally, for completeness, it would be worth considering "Annex A: Abstract test suite" and double-checking that the cptesting tool covers every single test from the "Server test module" (A3). Maybe, to add clarity, we can also plan to provide a reference for every test from the A3 section and display their names right before the test results.
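As a concrete illustration of such a check (the service URL is hypothetical, and schema validation is shown here with xmllint, which may differ from what cptesting actually does), a workflow step can fail the job on an unexpected status code or an invalid response:

```yaml
      # Hypothetical step: fail the job when GetCapabilities does not return HTTP 200
      # or when the response does not validate against the WPS 1.0.0 schema.
      - name: Check GetCapabilities
        run: |
          URL="http://localhost/cgi-bin/zoo_loader.cgi?service=WPS&version=1.0.0&request=GetCapabilities"
          CODE=$(curl -s -o capabilities.xml -w "%{http_code}" "$URL")
          if [ "$CODE" != "200" ]; then
            echo "Unexpected status code: $CODE"
            exit 1
          fi
          xmllint --noout --schema http://schemas.opengis.net/wps/1.0.0/wpsGetCapabilities_response.xsd capabilities.xml
```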
