errors: improve performance of NodeError instantiation #49654
Conversation
Benchmark CI: https://ci.nodejs.org/view/Node.js%20benchmark/job/benchmark-node-micro-benchmarks/1386/ Results
Do the tests need to pass to trigger the benchmark?
No, it's a separate process.
lgtm, looks really good
I had to patch the WPT test harness, because it checks whether the constructor equals the base class. I tried to figure out if I could somehow cheat that logic; it worked in a userland script, but not with the NodeError inside Node.js. I spent two days on this PR and tested a lot of variations of prototype inheritance, trying to avoid the ObjectDefineProperties calls while keeping the attributes non-enumerable, but with no luck. Only with the proposed ES6 class could I achieve the desired behaviour.

Regarding the remaining failing test, I really have no clue why it does not produce the desired error message. I thought maybe I had messed up badly and the errors somehow shared state, but no. That is also why I proposed the PR as-is: if somebody could help with this, I would appreciate it, because I really don't know what the issue is.

By using an ES6 class we use V8's own captureStackTrace logic and hide the Error with it. It would also mean that, if we use this solution for all of Node's errors, we optimize them basically to the maximum. Any further performance gains would then require optimizing V8 itself.
Benchmark CI output:
well done
Note this means Node gives worse errors, since captureLargerStackTrace was used to add more stack frames in case Error.stackTraceLimit was set to a low value (which likely means the user would otherwise get no meaningful frames from outside of core), I think? I think 656ce92#diff-670bf55805b781d9a3579f8bca9104c04d94af87cc33220149fd7d37b095ca1cR413 is the change that added it: #35644
My assessment: now that we don't need to instantiate the base error but generate the error instance directly, the extra NodeError stack frame is not created anymore. So we can use V8's stack-trace functionality directly, without the need to manually limit the stack trace.

The new logic:

'use strict';
const kIsNodeError = Symbol('kIsNodeError');
function makeNodeErrorWithCode(Base, key) {
class NodeError extends Base {
#message = '';
code = key;
constructor(...args) {
super();
this.message = 'bla'
}
get ['constructor']() {
return Base;
}
get [kIsNodeError]() {
return true;
}
get message() {
return this.#message;
}
set message(value) {
this.#message = value;
}
[Symbol.toStringTag] = 'Error'
toString() {
return `${this.name} [${key}]: ${this.message}`;
}
}
Object.setPrototypeOf(NodeError.constructor, Base)
return NodeError;
}
const base = TypeError
const Err = makeNodeErrorWithCode(base, 'ERR_INVALID_URL');
function test() {
try {
test1()
} catch (e) {
console.log(e.stack)
}
}
function test1() {
test2()
}
function test2() {
test3()
}
function test3() {
test4()
}
function test4() {
test5()
}
function test5() {
test6()
}
function test6() {
test7()
}
function test7() {
test8()
}
function test8() {
test9()
}
function test9() {
test10()
}
function test10() {
throw new Err()
}
test()

aras@aras-Lenovo-Legion-5-17ARH05H:~/workspace/node$ ./node test.js
TypeError: bla
at test10 (/home/aras/workspace/node/test.js:89:9)
at test9 (/home/aras/workspace/node/test.js:85:3)
at test8 (/home/aras/workspace/node/test.js:81:3)
at test7 (/home/aras/workspace/node/test.js:77:3)
at test6 (/home/aras/workspace/node/test.js:73:3)
at test5 (/home/aras/workspace/node/test.js:69:3)
at test4 (/home/aras/workspace/node/test.js:65:3)
at test3 (/home/aras/workspace/node/test.js:61:3)
at test2 (/home/aras/workspace/node/test.js:57:3)
at test1 (/home/aras/workspace/node/test.js:53:3)

vs. the old logic:

'use strict';
const kIsNodeError = Symbol('kIsNodeError');
function makeNodeErrorWithCode(Base, key) {
return function NodeError(...args) {
const limit = Error.stackTraceLimit;
Error.stackTraceLimit = 0;
const error = new Base();
// Reset the limit and setting the name property.
Error.stackTraceLimit = limit;
const message = 'bla'
Object.defineProperties(error, {
[kIsNodeError]: {
__proto__: null,
value: true,
enumerable: false,
writable: false,
configurable: true,
},
message: {
__proto__: null,
value: message,
enumerable: false,
writable: true,
configurable: true,
},
toString: {
__proto__: null,
value() {
return `${this.name} [${key}]: ${this.message}`;
},
enumerable: false,
writable: true,
configurable: true,
},
});
Error.captureStackTrace(error);
error.code = key;
return error;
};
}
const base = TypeError
const Err = makeNodeErrorWithCode(base, 'ERR_INVALID_URL');
function test() {
try {
test1()
} catch (e) {
console.log(e.stack)
}
}
function test1() {
test2()
}
function test2() {
test3()
}
function test3() {
test4()
}
function test4() {
test5()
}
function test5() {
test6()
}
function test6() {
test7()
}
function test7() {
test8()
}
function test8() {
test9()
}
function test9() {
test10()
}
function test10() {
throw new Err()
}
Error.stackTraceLimit = 10
test()

TypeError: bla
at new NodeError (/home/aras/workspace/node/test.js:39:11)
at test10 (/home/aras/workspace/node/test.js:94:9)
at test9 (/home/aras/workspace/node/test.js:90:3)
at test8 (/home/aras/workspace/node/test.js:86:3)
at test7 (/home/aras/workspace/node/test.js:82:3)
at test6 (/home/aras/workspace/node/test.js:78:3)
at test5 (/home/aras/workspace/node/test.js:74:3)
at test4 (/home/aras/workspace/node/test.js:70:3)
at test3 (/home/aras/workspace/node/test.js:66:3)
at test2 (/home/aras/workspace/node/test.js:62:3)
Can you please help me with this PR? I can't figure out why ERR_NOT_SUPPORTED_IN_SNAPSHOT does not get the correct message.
Is this better?
Should the other errors also get enumerable message attributes, e.g. ERR_NOT_BUILDING_SNAPSHOT?
Force-pushed from e4365ae to a3c2d60
Commit Queue failed
- Loading data for nodejs/node/pull/49654
✔ Done loading data for nodejs/node/pull/49654
----------------------------------- PR info ------------------------------------
Title      errors: improve performance of NodeError instantiation (#49654)
Author     Aras Abbasi (@Uzlopak)
Branch     Uzlopak:improve-perf-error-creation -> nodejs:main
Labels     lib / src, performance, needs-ci, needs-benchmark-ci, commit-queue-squash
Commits    1
- errors: improve performance of instantiation
Committers 1
- uzlopak
PR-URL: https://github.com/nodejs/node/pull/49654
Reviewed-By: Matteo Collina
Reviewed-By: Yagiz Nizipli
Reviewed-By: Stephen Belanger
Reviewed-By: Joyee Cheung
------------------------------ Generated metadata ------------------------------
PR-URL: https://github.com/nodejs/node/pull/49654
Reviewed-By: Matteo Collina
Reviewed-By: Yagiz Nizipli
Reviewed-By: Stephen Belanger
Reviewed-By: Joyee Cheung
--------------------------------------------------------------------------------
⚠ Commits were pushed since the last approving review:
⚠ - errors: improve performance of instantiation
ℹ This PR was created on Thu, 14 Sep 2023 19:12:31 GMT
✔ Approvals: 4
✔ - Matteo Collina (@mcollina) (TSC): https://github.com/nodejs/node/pull/49654#pullrequestreview-1628257155
✔ - Yagiz Nizipli (@anonrig) (TSC): https://github.com/nodejs/node/pull/49654#pullrequestreview-1642822449
✔ - Stephen Belanger (@Qard): https://github.com/nodejs/node/pull/49654#pullrequestreview-1633759369
✔ - Joyee Cheung (@joyeecheung) (TSC): https://github.com/nodejs/node/pull/49654#pullrequestreview-1640186686
✔ Last GitHub CI successful
ℹ Last Benchmark CI on 2023-09-25T21:46:15Z: https://ci.nodejs.org/view/Node.js%20benchmark/job/benchmark-node-micro-benchmarks/1419/
ℹ Last Full PR CI on 2023-09-28T09:00:48Z: https://ci.nodejs.org/job/node-test-pull-request/54318/
- Querying data for job/node-test-pull-request/54318/
✔ Last Jenkins CI successful
--------------------------------------------------------------------------------
✔ Aborted `git node land` session in /home/runner/work/node/node/.ncu
https://github.com/nodejs/node/actions/runs/6337251698
Ah come on...
Landed in cc725a6
YEAH
well done @Uzlopak!
Thank you. The next step is improving the performance of SystemError.
PR-URL: nodejs#49654 Reviewed-By: Matteo Collina <matteo.collina@gmail.com> Reviewed-By: Yagiz Nizipli <yagiz@nizipli.com> Reviewed-By: Stephen Belanger <admin@stephenbelanger.com> Reviewed-By: Joyee Cheung <joyeec9h3@gmail.com> Reviewed-By: Raz Luvaton <rluvaton@gmail.com>
Well, I don't know why captureLargerStackTrace exists. Maybe my approach is wrong; I don't know if I lack some Node-specific knowledge.

I tried to close the gap between the V8 Error class and NodeError. If we really need to support instantiating errors without new, we could wrap it in a function. On the other hand, other errors are also class-typed.

The tests basically pass locally, except /test/parallel/test-snapshot-worker.js. Maybe I messed something up (hope not :D).

So please don't be too harsh. 🫣
benchmarks:
Main:
PR
So before we have about 170k and with the PR we have 413k. That is roughly 2.4x as fast, i.e. a gain of about 143% (413k / 170k ≈ 2.43).
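For reference, the gain from the two throughput figures above works out as follows (approximate numbers taken from the benchmark output):

```javascript
'use strict';

// Relative improvement from the reported throughputs (approximate values).
const before = 170_000; // ops/s on main
const after = 413_000;  // ops/s with this PR

const ratio = after / before;          // ~2.43, i.e. about 2.4x as fast
const gainPercent = (ratio - 1) * 100; // ~143% more ops/s

console.log(ratio.toFixed(2), `${gainPercent.toFixed(0)}%`);
```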