Survey questions for end users to decide what to benchmark #136
You got the wrong handle.
@bmeurer sorry about that.
Here's a preliminary list of bullet points. These should be text-entry questions, to leave the answers open:
Mostly WIP at this point, and targeted towards figuring out what JavaScript-focused benchmarks are needed. That doesn't really say a lot about the other performance aspects of Node.
There was already a survey run earlier this year, which might be related; the results were published here. As mentioned in the meeting, the points above are highly focused on the underlying JavaScript engine. It would be interesting if folks with experience/interest in other areas of Node could add questions related to those areas to the list.
We are missing a good framework in the ecosystem for microbenchmarking I/O operations with statistical significance. Something similar to what we have in core, but something we can use to measure promises, callbacks, and database drivers. I've taken several shots at this, and none of them was great. A typical question I get is: "how does async/await stack up against callbacks?" Some folks (including myself) have been putting some time into https://github.com/fastify/benchmarks. This is not comprehensive, but the various frameworks do more or less the same things.
What is your primary use case for Node? (Web Developer Tooling / Standalone Servers / Cloud Services / Other)
Opened a request for help with sending out the survey: nodejs/community-committee#153
Survey is live here: https://www.surveymonkey.com/r/NodeBenchmarking |
We have a dump of the responses so far. I'm not going to post them here, as I believe we should wait for the poll to close before making the results public, in order to avoid influencing the remaining responses. If members of the @nodejs/benchmarking team would like a copy, just let me know and I'll send it on through email.
@mhdawson Can you send me a copy of the responses so far?
@bmeurer I realized some data might have to be stripped out (for example, any identifying info), and have just asked the question I should have asked earlier: whether any of the data is sensitive. Of course we'll need to share with the benchmarking team, but I want to agree with the Foundation contact running the survey on how and what we should share. It may take me a bit more time to do that.
Closing, since the survey is complete and we are just waiting for the final data from the user-feedback team.
Generate list of questions, then work with community committee to see if we can send out survey from Node.js Foundation/community.
@bmeurer will come up with some initial ideas.