📖 Add documentation about probes and contributing #3762

Merged 12 commits on Jan 23, 2024

CONTRIBUTING.md (29 additions, 0 deletions)
@@ -16,7 +16,10 @@ project. This document describes the contribution guidelines for the project.
* [How to build scorecard locally](#how-to-build-scorecard-locally)
* [PR Process](#pr-process)
* [What to do before submitting a pull request](#what-to-do-before-submitting-a-pull-request)
* [Changing Score Results](#changing-score-results)
* [Linting](#linting)
* [Permission for GitHub personal access tokens](#permission-for-github-personal-access-tokens)
* [Adding New Probes](#adding-new-probes)
* [Where the CI Tests are configured](#where-the-ci-tests-are-configured)
* [dailyscore-cronjob](#dailyscore-cronjob)
* [Deploying the cron job](#deploying-the-cron-job)
@@ -126,6 +129,9 @@ assumed to match the PR. For instance, if you have a bugfix along with a breaking
change, it's generally encouraged to submit the bugfix separately, but if you
must put them in one PR, you should mark the whole PR as breaking.

When a maintainer reviews your code, it is generally preferred to address each review
comment with a small, separate fix rather than rebasing, so the maintainer can assess each fix individually.

## What to do before submitting a pull request

The following targets can be used to test your changes locally.
@@ -139,6 +145,25 @@ Make sure to sign off your commits before submitting a pull request.

https://docs.pi-hole.net/guides/github/how-to-signoff/

When developing locally, the following commands are useful to run regularly to check unit tests and linting.

| Command | Description | Is called in the CI? |
| ------- | ----------- | -------------------- |
| `make unit-test` | Runs unit tests only. `make all` will also run this. | yes |
| `make check-linter` | Checks linter issues only. `make all` will also run this. | yes |

## Changing Score Results

As a general rule of thumb, pull requests that change Scorecard score results need a good reason in order to be merged.
It is a good idea to discuss such changes in a GitHub issue before implementing them.

## Linting

Most linter issues can be fixed by running `golangci-lint` with the following command:

```
make fix-linter
```

## Permission for GitHub personal access tokens

The personal access token needs the following scopes:
@@ -168,6 +193,10 @@ Commit the changes, and submit a PR; Scorecard will start scanning in subsequent runs.
See [checks/write.md](checks/write.md).
When you add new checks, you need to also update the docs.

## Adding New Probes

See [probes/README.md](probes/README.md) for information about the probes.

## Updating Docs

A summary for each check needs to be included in the `README.md`.
Makefile (5 additions, 0 deletions)

@@ -94,6 +94,11 @@ check-linter: | $(GOLANGCI_LINT)
# Run golangci-lint linter
$(GOLANGCI_LINT) run -c .golangci.yml

fix-linter: ## Install and run golang linter, with fixes
fix-linter: | $(GOLANGCI_LINT)
# Run golangci-lint linter
$(GOLANGCI_LINT) run -c .golangci.yml --fix

add-projects: ## Adds new projects to ./cron/internal/data/projects.csv and ./cron/internal/data/gitlab-projects.csv
add-projects: ./cron/internal/data/projects.csv | build-add-script
# GitHub
probes/README.md (70 additions, 0 deletions)

@@ -0,0 +1,70 @@
# Scorecard probes

This directory contains all the Scorecard probes.

A probe is an assessment of a focused, specific heuristic, typically isolated to a particular ecosystem. For example, Scorecard's fuzzing check consists of many different probes that assess particular ecosystems or aspects of fuzzing.

Each probe has its own directory in `scorecard/probes`. Probes follow a camelCase naming convention that describes the exact heuristic a particular probe assesses.

Probes can return a single finding or multiple findings, where a finding is a piece of data with an outcome, a message, and optionally a location. Probes should be designed such that `finding.OutcomePositive` reflects a positive result and `finding.OutcomeNegative` reflects a negative result. Scorecard has other `finding.Outcome` types available for other results; for example, `finding.OutcomeNotAvailable` is often used for scenarios where Scorecard cannot assess a project with a given probe.

Probes should also be named such that they answer "yes" or "no", where "yes" answers positively to the heuristic and "no" answers negatively. For example, probes that check for SAST tools in the CI are called `toolXXXInstalled`, so that `finding.OutcomePositive` reflects that using the given tool is considered positive, and "yes" reflects what Scorecard considers the positive outcome. For some probes this is trickier: the `notArchived` probe checks whether a project is archived, but Scorecard considers archived projects to be negative, so the probe cannot be called `isArchived`. These naming conventions are not hard rules but merely guidelines.

Note that probes do not do any formal evaluation such as scoring; this is left to the evaluation step once the outcomes have been produced by the probes.
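To make these conventions concrete, here is a minimal, hypothetical `impl.go` sketch for an imaginary `toolExampleInstalled` probe (the file layout is described below). The `Run` signature and the `finding.NewWith` call follow the patterns used by existing probes; the `toolDetected` helper is invented for this sketch, and a real probe would derive its outcome from `checker.RawResults`.

```golang
// Package toolExampleInstalled is a hypothetical probe used for illustration.
package toolExampleInstalled

import (
	"embed"
	"fmt"

	"github.com/ossf/scorecard/v4/checker"
	"github.com/ossf/scorecard/v4/finding"
)

// Each probe embeds its own def.yml (described below).
//go:embed *.yml
var fs embed.FS

// Probe must match the probe's directory name.
const Probe = "toolExampleInstalled"

// Run inspects the raw check results and returns findings with an outcome.
func Run(raw *checker.RawResults) ([]finding.Finding, string, error) {
	if raw == nil {
		return nil, Probe, fmt.Errorf("raw results are nil")
	}

	// "yes" (the tool is installed) maps to OutcomePositive.
	outcome := finding.OutcomeNegative
	message := "the example tool is not installed"
	if toolDetected(raw) {
		outcome = finding.OutcomePositive
		message = "the example tool is installed"
	}

	f, err := finding.NewWith(fs, Probe, message, nil, outcome)
	if err != nil {
		return nil, Probe, fmt.Errorf("create finding: %w", err)
	}
	return []finding.Finding{*f}, Probe, nil
}

// toolDetected is a stand-in for the real logic, which would read
// fields on checker.RawResults populated by Scorecard's raw checks.
func toolDetected(raw *checker.RawResults) bool {
	return false
}
```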

A probe consists of three files:

- `def.yml`: The documentation of the probe.
- `impl.go`: The actual implementation of the probe.
- `impl_test.go`: The probe's tests.
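
For orientation, the sketch below shows roughly what a `def.yml` looks like, reusing the hypothetical `toolExampleInstalled` probe from above. The field names mirror those used by existing probes at the time of writing; treat a checked-in probe's `def.yml` as the authoritative reference for the schema.

```yaml
id: toolExampleInstalled
short: Check that the example tool is installed in the project's CI.
motivation: >
  Explain why this heuristic matters for the project's security posture.
implementation: >
  Describe how the probe inspects the raw results to reach its outcome.
outcome:
  - If the tool is installed, the probe returns OutcomePositive.
  - If the tool is not installed, the probe returns OutcomeNegative.
remediation:
  effort: Low
  text:
    - Install the example tool in the project's CI.
```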

## Reusing code in probes

When multiple probes use the same code, the reused code can be placed in `scorecard/probes/internal/utils`.

## How do I know which probes to add?

In general, browsing through the Scorecard GitHub issues is the best way to find new probes to add. Requests for support for new tools, fuzzing engines or other heuristics can often be converted into specific probes.

## Probe definition formatting

Probe definitions can display links following standard markdown format.

Probe definitions can display dynamic content. This requires modifications in `def.yml` and `impl.go` and in the evaluation steps.

The following snippet in `def.yml` will display dynamic data provided by `impl.go`:

```md
${{ metadata.dataToDisplay }}
```

And then in `impl.go` add the following metadata:

```golang
f, err := finding.NewWith(fs, Probe,
	"Message", nil,
	finding.OutcomePositive)
if err != nil {
	return nil, Probe, fmt.Errorf("create finding: %w", err)
}
// The map key must match the name referenced in def.yml: ${{ metadata.dataToDisplay }}.
f = f.WithRemediationMetadata(map[string]string{
	"dataToDisplay": "this is the text we will display",
})
```

### Example
Consider a probe with the following line in its `def.yml`:
```md
The project ${{ metadata.oss-fuzz-integration-status }} integrated into OSS-Fuzz.
```

and the probe sets the following metadata:
```golang
f, err := finding.NewWith(fs, Probe,
	"Message", nil,
	finding.OutcomePositive)
f = f.WithRemediationMetadata(map[string]string{
	"oss-fuzz-integration-status": "is",
})
```
```

The probe will then output the following text:
```
The project is integrated into OSS-Fuzz.
```

### Should the changes be in the probe or the evaluation?
The remediation data must be set in the probe.