feat: add ability to specify entryType #266

Open · wants to merge 2 commits into base: main
4 changes: 3 additions & 1 deletion CHANGELOG.md
@@ -5,7 +5,9 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/) and this
project adheres to [Semantic Versioning](http://semver.org/).

<!-- ## Unreleased -->
## Unreleased

- Added the optional `entryType` option to specify the `entryType` of a [`PerformanceEntry`](https://developer.mozilla.org/en-US/docs/Web/API/PerformanceEntry).

## [0.7.1] 2024-07-18

15 changes: 15 additions & 0 deletions README.md
@@ -252,6 +252,21 @@ The following performance entry types are supported:
Retrieve the `startTime` of a built-in paint measurement (e.g.
`first-contentful-paint`).

If you have multiple entries with the same name (for example, a `mark` and a `measure` both named `foo`),
you can specify a particular `entryType` to narrow the result down:

```json
{
  "benchmarks": [
    {
      "measurement": {
        "mode": "performance",
        "entryName": "foo",
        "entryType": "measure"
      }
    }
  ]
}
```
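To illustrate why narrowing is needed, here is a hypothetical sketch (plain objects standing in for real browser `PerformanceEntry` instances, not this project's code) of a `mark` and a `measure` that share one name:

```typescript
// A minimal stand-in for the browser's PerformanceEntry shape.
interface Entry {
  name: string;
  entryType: 'mark' | 'measure' | 'paint';
  startTime: number;
}

// Both entries are named 'foo', as produced by performance.mark('foo')
// followed by performance.measure('foo', 'foo').
const entries: Entry[] = [
  {name: 'foo', entryType: 'mark', startTime: 5},
  {name: 'foo', entryType: 'measure', startTime: 5},
];

// A lookup by name alone matches both; filtering on entryType keeps
// only the measure.
const narrowed = entries.filter((e) => e.entryType === 'measure');
console.log(narrowed.length); // 1
```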

#### Callback

By default with local (non-URL) benchmarks, or when the `--measure` flag is set
8 changes: 8 additions & 0 deletions config.schema.json
@@ -376,6 +376,14 @@
"entryName": {
"type": "string"
},
"entryType": {
"enum": [
"mark",
"measure",
"paint"
],
"type": "string"
},
"mode": {
"enum": [
"performance"
7 changes: 6 additions & 1 deletion src/measure.ts
@@ -70,7 +70,12 @@ async function queryForPerformanceEntry(
): Promise<number | undefined> {
const escaped = escapeStringLiteral(measurement.entryName);
const script = `return window.performance.getEntriesByName(\`${escaped}\`);`;
const entries = (await driver.executeScript(script)) as PerformanceEntry[];
let entries = (await driver.executeScript(script)) as PerformanceEntry[];
if (typeof measurement.entryType === 'string') {
entries = entries.filter(
(entry) => entry.entryType === measurement.entryType
);
}
if (entries.length === 0) {
return undefined;
}
19 changes: 19 additions & 0 deletions src/test/data/performance-measure-specific-entry-type.html
@@ -0,0 +1,19 @@
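The surrounding polling contract can be sketched like this (hypothetical names; a simplified stand-in for `queryForPerformanceEntry`, which in the real code also handles the different entry kinds and the WebDriver round trip): the query returns `undefined` until an entry of the requested type exists, so a `measure`-typed query keeps polling past the earlier `mark`.

```typescript
interface Entry {
  name: string;
  entryType: string;
  duration: number;
}

// Simplified sketch: filter by name, optionally by entryType, and report
// undefined when nothing matches yet so the caller keeps polling.
function query(
  entries: Entry[],
  entryName: string,
  entryType?: string
): number | undefined {
  let matches = entries.filter((e) => e.name === entryName);
  if (typeof entryType === 'string') {
    matches = matches.filter((e) => e.entryType === entryType);
  }
  if (matches.length === 0) {
    return undefined;
  }
  // Simplified: the real implementation does more with multiple matches.
  return matches[0].duration;
}

// Only the mark exists so far: a measure-typed query finds nothing.
const markOnly: Entry[] = [{name: 'foo', entryType: 'mark', duration: 0}];
console.log(query(markOnly, 'foo', 'measure')); // undefined
```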
<!DOCTYPE html>
<!--
This test benchmark sets a performance API measurement after a delay.
-->
<html>
<head>
<title>performance measure test with specific entryType</title>
</head>
<body>
<script>
const params = new URL(document.location).searchParams;
const wait = Number(params.get('wait'));
const start = performance.mark('foo');
setTimeout(() => {
performance.measure('foo', 'foo');
}, wait);
</script>
</body>
</html>
33 changes: 33 additions & 0 deletions src/test/data/performance-measure-specific-entry-type.json
@@ -0,0 +1,33 @@
{
"$schema": "https://raw.githubusercontent.com/Polymer/tachometer/master/config.schema.json",
"sampleSize": 10,
"timeout": 0,
"benchmarks": [
{
"name": "20",
"url": "performance-measure-specific-entry-type.html?wait=20",
"measurement": {
"mode": "performance",
"entryName": "foo",
"entryType": "measure"
},
"browser": {
"name": "chrome",
"headless": true
}
},
{
"name": "60",
"url": "performance-measure-specific-entry-type.html?wait=60",
"measurement": {
"mode": "performance",
"entryName": "foo",
"entryType": "measure"
},
"browser": {
"name": "chrome",
"headless": true
}
}
]
}
35 changes: 35 additions & 0 deletions src/test/e2e_test.ts
@@ -226,6 +226,41 @@ suite('e2e', function () {
})
);

test(
'performance entry with specific entryType',
hideOutput(async function () {
const delayA = 20;
const delayB = 60;

// TODO(aomarks) This isn't actually testing each browser, since
// the browser is hard coded in the config file. Generate the JSON
// file dynamically instead.
const argv = [
`--config=${path.join(
testData,
'performance-measure-specific-entry-type.json'
)}`,
];

const actual = await main(argv);
assert.isDefined(actual);
assert.lengthOf(actual!, 2);
const [a, b] = actual!;
const diffAB = a.differences[1]!;
const diffBA = b.differences[0]!;

// We can't be very precise with expectations here, since
// setTimeout can be quite variable on a resource starved machine
// (e.g. some of our CI builds).
assert.isAtLeast(a.stats.mean, delayA);
assert.isAtLeast(b.stats.mean, delayB);
assert.isBelow(ciAverage(diffAB.absolute), 0);
assert.isAbove(ciAverage(diffBA.absolute), 0);
assert.isBelow(ciAverage(diffAB.relative), 0);
assert.isAbove(ciAverage(diffBA.relative), 0);
})
);

test(
'multiple measurements',
hideOutput(async function () {
1 change: 1 addition & 0 deletions src/types.ts
@@ -87,6 +87,7 @@ export interface CallbackMeasurement extends MeasurementBase {
export interface PerformanceEntryMeasurement extends MeasurementBase {
mode: 'performance';
entryName: string;
entryType?: 'mark' | 'measure' | 'paint';
}

export interface ExpressionMeasurement extends MeasurementBase {
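Since `entryType` is optional, configs written before this PR still type-check. A hypothetical usage sketch (the `MeasurementBase` stub is an assumption; only the `PerformanceEntryMeasurement` fields come from this diff):

```typescript
// Hypothetical minimal stand-in for MeasurementBase.
interface MeasurementBase {
  name?: string;
}

// The shape added in this PR: entryType is optional.
interface PerformanceEntryMeasurement extends MeasurementBase {
  mode: 'performance';
  entryName: string;
  entryType?: 'mark' | 'measure' | 'paint';
}

// Existing configs omit entryType entirely and remain valid.
const legacy: PerformanceEntryMeasurement = {
  mode: 'performance',
  entryName: 'foo',
};

// New configs can narrow the lookup to one entry type.
const narrowed: PerformanceEntryMeasurement = {
  mode: 'performance',
  entryName: 'foo',
  entryType: 'measure',
};
console.log(legacy.entryType, narrowed.entryType); // undefined 'measure'
```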