Update Onyx.printMetrics format #89

Merged · 7 commits · Jul 16, 2021
99 changes: 72 additions & 27 deletions lib/decorateWithMetrics.js
@@ -6,6 +6,11 @@ import _ from 'underscore';
*/
let stats = {};

/* For some reason `performance.now()` does not start from `0` but a very large offset
* like `508,080,000` see: https://github.com/facebook/react-native/issues/30069
* Capturing an offset allows us to record start/ends times relative to app launch time */
Contributor:
Interesting, since it is supposed to be more reliable than Date, but it still has this issue? 🤔

Not a blocker, but I'm not sure I'd expect this to be 0. performance.now() returns the time elapsed since the time origin (or is supposed to, at least).

Contributor (author):
Yeah, it will never be 0, but the origin is definitely not app launch; the benchmark would report something like "Last call finished at: 3940min", which is useless.

Contributor:
Got it, that makes more sense to me now. Thanks for the explanation.

const APP_LAUNCH_TIME = performance.now();
Comment on lines +9 to +12
Contributor (author):
I've found this issue after I switched to the "new" debugging flow today: https://expensify.slack.com/archives/C01GTK53T8Q/p1626422647390900?thread_ts=1626297548.276300&cid=C01GTK53T8Q

Contributor (author):
It's related to the "old" Chrome debugger no longer being used.


/**
* Wraps a function with metrics capturing logic
* @param {function} func
@@ -20,7 +25,7 @@ function decorateWithMetrics(func, alias = func.name) {
stats[alias] = [];

function decorated(...args) {
const startTime = performance.now();
const startTime = performance.now() - APP_LAUNCH_TIME;
Contributor:
I'm trying to understand this next usage of performance.now() and which metric it will make more reliable. Is it just the endTime? All the other metrics are relative to performance.now(), but that one will tell us the actual time from when Onyx started doing work to when it finished for each method?

Contributor (author), @kidroca, Jul 16, 2021:
This doesn't make anything more or less reliable; it just keeps our start/end times relative to app launch.

Differences between performance.now() values are reliable, but the initial value performance.now() returns is ridiculously high, and I can't match it to anything: it's not the current Unix timestamp, and it's not the time since app launch. It's a value like 508 million that increases as time passes. Might be relative to midnight 00:00 UTC? But that doesn't help us track boot time 😄

On the old flow, when this was provided by Chrome, the value was very close to 0 initially, so you knew a value of 1000 meant something started/ended a second past app launch.

By capturing our own reference point, APP_LAUNCH_TIME, we keep start/end times relative to that point, so we have more useful information.

Contributor (author):
If I remove this, I currently get "Last call finished at: 8907.4min", so it's not midnight.
It might be the time since the last reboot; let me check.
Yep, that seems to be it: "Last call finished at: 10.0min".
I guess it uses some clock managed by the OS, and that's what it returns.

Contributor:
Haha, ok, awesome, thanks for the explanation.
Very weird, but now that we are interested in something besides relative times, also very necessary.


const originalPromise = func.apply(this, args);

Expand All @@ -30,7 +35,7 @@ function decorateWithMetrics(func, alias = func.name) {
* */
originalPromise
.finally(() => {
const endTime = performance.now();
const endTime = performance.now() - APP_LAUNCH_TIME;

if (!_.has(stats, alias)) {
stats[alias] = [];
@@ -62,9 +67,12 @@ function sum(list, prop) {
}

/**
* Returns total, average time and all captured stats mapped under
* summaries.methodName -> method stats
* @returns {{averageTime: number, summaries: Record<string, Object>, totalTime: number}}
* Aggregates and returns benchmark information
* @returns {{summaries: Record<string, Object>, totalTime: number, lastCompleteCall: *}}
* An object with
* - `totalTime` - total time spent by decorated methods
 * - `lastCompleteCall` - milliseconds since launch at which the last call completed
* - `summaries` - mapping of all captured stats: summaries.methodName -> method stats
*/
function getMetrics() {
const summaries = _.chain(stats)
@@ -74,30 +82,42 @@
const avg = (total / calls.length) || 0;
const max = _.max(calls, 'duration').duration || 0;
const min = _.min(calls, 'duration').duration || 0;
const lastCall = _.max(calls, 'endTime');

return [methodName, {
methodName,
total,
max,
min,
avg,
lastCall,
calls,
}];
})
.object() // Create a map like methodName -> StatSummary
.value();

const totalTime = sum(_.values(summaries), 'total');
const averageTime = (totalTime / _.size(summaries)) || 0;
const lastCompleteCall = _.max(_.values(summaries), ['lastCall', 'endTime']).lastCall;

return {
totalTime,
averageTime,
summaries,
lastCompleteCall,
};
}
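For reference, the per-method aggregation getMetrics() performs can be sketched with plain Array methods instead of underscore; the `calls` array below is made-up sample data, not real Onyx output:

```javascript
// Sketch of the per-method aggregation: total, average, max, min duration,
// plus the call that finished last (like _.max(calls, 'endTime') above).
const calls = [
    {startTime: 10, endTime: 30, duration: 20},
    {startTime: 40, endTime: 90, duration: 50},
];

const total = calls.reduce((acc, call) => acc + call.duration, 0);
const avg = (total / calls.length) || 0;
const max = Math.max(...calls.map(call => call.duration));
const min = Math.min(...calls.map(call => call.duration));

// The call with the greatest endTime, i.e. the one that completed last.
const lastCall = calls.reduce((a, b) => (a.endTime > b.endTime ? a : b));
```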

function toHumanReadableDuration(millis) {
/**
* Convert milliseconds to human readable time
* @param {number} millis
* @param {boolean} [raw=false]
* @returns {string|number}
*/
function toDuration(millis, raw = false) {
if (raw) {
return millis;
}

const minute = 60 * 1000;
if (millis > minute) {
return `${(millis / minute).toFixed(1)}min`;
@@ -115,30 +135,55 @@
* Print extensive information on the dev console
* max, min, average, total time for each method
* and a table of individual calls
*
 * @param {boolean} [raw=false] setting this to true will print raw times instead of human-friendly ones.
 * Useful when you copy the printed table to Excel and let Excel do the number formatting
*/
function printMetrics() {
const {totalTime, averageTime, summaries} = getMetrics();
function printMetrics(raw = false) {
const {totalTime, summaries, lastCompleteCall = {endTime: -1}} = getMetrics();

/* eslint-disable no-console */
console.group('Onyx Benchmark');
console.info(' Total: ', toHumanReadableDuration(totalTime));
console.info(' Average: ', toHumanReadableDuration(averageTime));

_.chain(summaries)
const prettyData = _.chain(summaries)
.filter(method => method.avg > 0)
.sortBy('avg')
.reverse()
.forEach(({calls, methodName, ...summary}) => {
const times = _.map(summary, (value, key) => `${key}: ${toHumanReadableDuration(value)}`);

console.groupCollapsed(`${methodName}\n ${times.join('\n ')} \n calls: ${calls.length}`);
console.table(calls.map(call => ({
startTime: toHumanReadableDuration(call.startTime),
endTime: toHumanReadableDuration(call.endTime),
duration: toHumanReadableDuration(call.duration),
.map(({
calls, methodName, lastCall, ...summary
}) => {
const prettyTimes = _.chain(summary)
.map((value, key) => ([key, toDuration(value, raw)]))
.object()
.value();

const prettyCalls = calls.map(call => ({
startTime: toDuration(call.startTime, raw),
endTime: toDuration(call.endTime, raw),
duration: toDuration(call.duration, raw),
args: JSON.stringify(call.args)
})));
console.groupEnd();
});
}));

return {
methodName,
...prettyTimes,
'time last call completed': toDuration(lastCall.endTime, raw),
calls: calls.length,
prettyCalls,
};
})
.value();

/* eslint-disable no-console */
console.group('Onyx Benchmark');
console.info(' Total: ', toDuration(totalTime, raw));
console.info(' Last call finished at: ', toDuration(lastCompleteCall.endTime, raw));

console.table(prettyData.map(({prettyCalls, ...summary}) => summary));

prettyData.forEach((method) => {
console.groupCollapsed(`[${method.methodName}] individual calls: `);
console.table(method.prettyCalls);
console.groupEnd();
});

console.groupEnd();
/* eslint-enable */
}
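The formatter's sub-minute branches are cut off in the diff above. A sketch of the whole helper follows; the minute branch matches the diff, while the second and millisecond branches are assumptions about the elided code:

```javascript
// Sketch of toDuration. Only the raw short-circuit and the minute branch
// appear in the diff; the "sec" and "ms" branches below are assumed.
function toDuration(millis, raw = false) {
    if (raw) {
        return millis;
    }

    const minute = 60 * 1000;
    if (millis > minute) {
        return `${(millis / minute).toFixed(1)}min`;
    }

    const second = 1000;
    if (millis > second) {
        return `${(millis / second).toFixed(2)}sec`;
    }

    return `${millis.toFixed(3)}ms`;
}
```

Passing raw = true skips formatting entirely, which is what makes the printed table paste-able into a spreadsheet.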
4 changes: 2 additions & 2 deletions tests/unit/decorateWithMetricsTest.js
@@ -24,7 +24,7 @@ describe('decorateWithMetrics', () => {
const metrics = getMetrics();
expect(metrics).toEqual(expect.objectContaining({
totalTime: expect.any(Number),
averageTime: expect.any(Number),
lastCompleteCall: expect.any(Object),
summaries: expect.objectContaining({mockFn: expect.any(Object)}),
}));

@@ -264,7 +264,7 @@ describe('decorateWithMetrics', () => {
// THEN stats should be empty and the total time 0
expect(result.summaries).toEqual({});
expect(result.totalTime).toEqual(0);
expect(result.averageTime).toEqual(0);
expect(result.lastCompleteCall).not.toBeDefined();
});

it('Should calculate total and average correctly', () => {