feat: recording verification rule #128
Conversation
@@ -21,7 +22,7 @@ export function createNewGeneratorFile(recordingPath = ''): GeneratorFileData {
    testData: {
      variables: [],
    },
-   rules: [],
+   rules: [createEmptyRule('recording-verification')],
this rule will be present on every newly created generator as a default that can be removed
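For illustration only, a minimal sketch of the rule shape createEmptyRule might produce; the real function and its types live elsewhere in this codebase, and every field name here is an assumption:

type RuleType = 'recording-verification'

// Hypothetical shape for illustration; field names are assumptions.
function createEmptyRule(type: RuleType) {
  return {
    type,
    id: crypto.randomUUID(), // assumed id field
    enabled: true, // assumed flag; the default rule "can be removed"
  }
}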
Sweet! 🤩 Works as advertised and adds checks with correct status codes 👌
A few comments:
- Not a blocker, but it would be nice to have a test case covering this in codegen.test.ts
- Should we terminate the test when a check fails? Currently there's no indication that checks failed, but I'm guessing that will be added as another task
According to the k6 docs on checks, they are not supposed to make the test fail. Regarding the missing indication, that's because check information seems to be available only in the end-of-test summary. I was already thinking we might need a way to retrieve the results and show them in the Validator, but we don't get them for free :(
That makes sense in the context of performance testing, but when using the Validator I would expect it to stop when something goes wrong, allowing me to see which request breaks it. I see 2 possible solutions here:
WDYT? Those suggestions are not meant to be part of this PR though, just ideas for future improvements.
In the immediate future I think we might get checks information only at the end of the test run 🤔 This would exclude option 2 short-term. For option 1, I was looking into how to make checks fail, and I think we can combine them with Thresholds, possibly thresholding on the error rate of the checks.
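As a minimal sketch of option 1, assuming we wire the generated checks into thresholds: k6 supports thresholds on the built-in checks rate metric, and abortOnFail stops the run early when the threshold is crossed:

export const options = {
  thresholds: {
    // Fail the run if any check fails, and abort early so the
    // breaking request is easier to spot in the Validator.
    checks: [{ threshold: 'rate==1', abortOnFail: true }],
  },
}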
Yeah, that's what we do for e2e tests
Re-requesting review as I've added a way to retrieve checks information to showcase in the Validator. There is one big issue: this can be brittle if the user uses their own version of handleSummary.
After looking into this some more - currently we are generating a check for every status code checked, and it looks something like this:
✗ is status 200
↳ 53% — ✓ 758 / ✗ 665
✗ is status 401
↳ 26% — ✓ 14 / ✗ 38
✗ is status 204
↳ 0% — ✓ 0 / ✗ 24
✗ is status 403
↳ 0% — ✓ 0 / ✗ 23
Which might not be very helpful. Perhaps we should name the check "response status code matches recording" so all checks generated by the verification rule are grouped under a single check. Another option would be adding the request URL to the check name, but that might generate too many checks 🤔
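A sketch of the grouping idea, reusing the codegen context from the snippet below (response stands for the recorded entry): one shared check name makes k6 aggregate every verification check into a single summary line:

const verificationSnippet = `
  check(resp, {
    'response status code matches recording': (r) => r.status === ${response.statusCode},
  })
`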
const verificationSnippet = `
  check(resp, {
    'Recording Verification Rule: status matches recording': (r) => r.status === ${response.statusCode},
  })
`
const handleSummaryIndex = scriptLines.findIndex(
  (line) =>
    // NOTE: if the custom handle summary is commented out we can still insert our snippet
    // this check should be improved
    line.includes('export function handleSummary(') && !line.includes('//')
)
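One possible way to improve the detection flagged in the NOTE, sketched under the assumption that handleSummary is always a top-level export: anchoring the match at the start of the line means commented-out definitions no longer count.

// Hypothetical refinement: only match an uncommented top-level export.
const handleSummaryPattern = /^\s*export\s+function\s+handleSummary\s*\(/
const handleSummaryIndex = scriptLines.findIndex((line) =>
  handleSummaryPattern.test(line)
)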
// NOTE: checks work only if the user doesn't define a custom summary handler
// if no custom handleSummary is defined we add our version to retrieve checks
if (handleSummaryIndex === -1) {
  scriptLines.push(checksSnippet)
}
We only add the checks-retrieval logic via handleSummary when the user hasn't defined a custom summary handler, since we retrieve that information from the summary.
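The checksSnippet itself isn't shown in this excerpt; as a hedged sketch, assuming only root-level check results are needed, the generated handleSummary could look roughly like this (note that defining handleSummary replaces k6's default end-of-test summary output):

const checksSnippet = `
export function handleSummary(data) {
  // Each entry in root_group.checks carries name, passes and fails counts.
  return {
    'checks.json': JSON.stringify(data.root_group.checks, null, 2),
  }
}
`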
LGTM! 🙌
Closes https://github.com/grafana/k6-cloud/issues/2530
This PR adds the "backend" logic for the Recording Verification Rule.
This rule adds checks verifying that the status codes of the responses match the ones from the recording.
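For illustration, assuming the recording captured a 200 response for a hypothetical request, the generated script would contain something along these lines:

import http from 'k6/http'
import { check } from 'k6'

export default function () {
  const resp = http.get('https://test.k6.io/api/users') // hypothetical recorded request
  check(resp, {
    'Recording Verification Rule: status matches recording': (r) => r.status === 200,
  })
}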
I've added this as a separate new rule because it's very specific, which seems to make sense, but it's open for debate whether it should be part of a bigger Verification rule that handles user-defined verification plus this global recording check 🤔