This repository has been archived by the owner on Oct 11, 2024. It is now read-only.

add test coverage table to github summary #368

Closed
wants to merge 18 commits into from

Conversation

@derekk-nm derekk-nm commented Jul 8, 2024

This PR adds a JSON form of the test coverage information, which until now has only been generated as HTML. The JSON content is then parsed by a script that generates a github-markdown-compatible table for inclusion in the test job summary. The table shows the overall test coverage, plus the coverage of each sub-directory at the level just below vllm.
This nm remote push job demonstrates that the code changes work (no failures). On its summary page, scroll down to the TEST summary to see the expected test coverage table, and to the artifacts listing for the JSON artifact (files starting with cc-vllm-json-*).

Also included are some additional functions that are now excluded/omitted from the test coverage; see pyproject.toml.
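For context, a minimal sketch of the JSON-to-markdown step described above. The real logic lives in .github/scripts/coverage_report_breakdown.py; here the key names follow coverage.py's `coverage json` schema, and the per-sub-directory aggregation is illustrative, not the PR's exact implementation:

```python
import json
from collections import defaultdict
from pathlib import Path


def coverage_breakdown(json_path: str) -> str:
    """Turn a `coverage json` report into a github-markdown table."""
    report = json.loads(Path(json_path).read_text())
    rows = [("overall", report["totals"]["percent_covered"])]

    # Aggregate covered/total statement counts per sub-directory of vllm.
    covered = defaultdict(int)
    statements = defaultdict(int)
    for path, info in report["files"].items():
        parts = Path(path).parts
        subdir = parts[1] if parts[0] == "vllm" and len(parts) > 2 else parts[0]
        covered[subdir] += info["summary"]["covered_lines"]
        statements[subdir] += info["summary"]["num_statements"]
    for subdir in sorted(statements):
        rows.append((subdir, 100.0 * covered[subdir] / statements[subdir]))

    # Emit a github-flavored markdown table.
    lines = ["| scope | coverage |", "| --- | --- |"]
    lines += [f"| {name} | {pct:.1f}% |" for name, pct in rows]
    return "\n".join(lines)
```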

derekk-nm added 17 commits July 3, 2024 19:19
collects code coverage report as JSON to upload to build artifacts, then parses the file to generate a markdown table for presentation in the GitHub summary.

Removes

Temporarily limits test cases and platforms in nm-remote-push, and skips benchmark, etc in nm-build-test to make testing go faster.
* proper reference to input variable
* install required tabulate package
* fix regex for exclude_also in coverage report
temporarily restrict tests to one file for debugging
may be easier than passing the multi-line table from one action to another.
this time with correction.
this time with correction.
and aligning columns
and revert changes used for testing purposes.
@andy-neuma andy-neuma left a comment


looking good

pip3 install tabulate
# The table is a multiline value, so it cannot be passed directly as a
# github output; redirect it to a file, then cat the file into the output
python3 ./.github/scripts/coverage_report_breakdown.py ${{ inputs.coverage_json }} > COVERAGE_MD
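The file produced by the redirect above then gets surfaced in the job summary. A hedged sketch of that follow-on step, assuming the COVERAGE_MD filename from the step above; the function name is made up for illustration:

```python
import os
from pathlib import Path


def append_to_summary(md_path: str = "COVERAGE_MD") -> None:
    """Append the generated markdown table to the GitHub job summary.

    GITHUB_STEP_SUMMARY is set by the Actions runner; anything appended
    to that file is rendered on the workflow run's summary page.
    """
    table = Path(md_path).read_text()
    with open(os.environ["GITHUB_STEP_SUMMARY"], "a") as summary:
        summary.write(table + "\n")
```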
Member

the table ends up looking nice.

args = parser.parse_args()
cc = CodeCoverage(Path(args.coverage_json_file))

print(cc.to_github_markdown())
Member

i like that you are outputting to a file. this makes it easy to run locally.

ignore_index=True)
# clean up the `nan` values for display purposes
summary_df = summary_df.astype(str)
summary_df.replace({"nan": None}, inplace=True)
Member

quick question, how do we generate "nan"?

Author

this was "fun" 😉. To get the empty row and the header row between the overall info and the per-sub-directory breakdown, I needed to add empty_row_df and header_row_df. For the concat to succeed, I had to populate those rows with pd.NA (which ends up as nan); once the concat was done, I could clean up those nans.
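A small sketch of the trick described above; the column names and values are made up, and the real frames live in coverage_report_breakdown.py. The spacer cells are filled with NaN, which astype(str) renders as the literal string "nan" that the replace call then clears:

```python
import pandas as pd

cols = ["scope", "coverage"]
overall_df = pd.DataFrame([["overall", "70.0%"]], columns=cols)
# The spacer and header rows must share the same columns for concat to
# work, so the blank cells are filled with NaN.
empty_row_df = pd.DataFrame([[float("nan")] * len(cols)], columns=cols)
header_row_df = pd.DataFrame([["sub-directory", "coverage"]], columns=cols)
detail_df = pd.DataFrame([["core", "80.0%"]], columns=cols)

summary_df = pd.concat(
    [overall_df, empty_row_df, header_row_df, detail_df],
    ignore_index=True)
# astype(str) turns NaN into the string "nan"; clear it for display.
summary_df = summary_df.astype(str)
summary_df.replace({"nan": None}, inplace=True)
```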

our test runs won't ever enable it
@andy-neuma andy-neuma left a comment


looks good to me.

@derekk-nm
Author

moved to nm-vllm-ent

@derekk-nm derekk-nm closed this Aug 1, 2024