My relevant setup is a Ryzen 7950X3D and a Sabrent Rocket NVMe SSD.
Initially my results showed Turbo as almost twice as fast as NX.
The benchmark itself showed me something very interesting: Turbo consistently ran the cached output in around ~380ms with very little deviation, while NX alternated between ~880ms and ~420ms runs, one after the other: ~880ms, ~420ms, ~880ms, ~420ms... and so on. That seemed odd.
But since these tests run back-to-back and hit the filesystem, I figured it was something I/O-related, so I modified the tests slightly and added a sleep between each run to let everything settle. I had to increase this sleep all the way to 5 seconds before NX gave me results as consistent as Turbo's. With that in place, NX finished in around ~430ms.
Anyway, I hope no one settles on a tool based on minute performance differences; the whole experience around it is much more important. For example, I don't like that in the NX example every package has to have an `nx` entry in its `package.json` that contains workspace-relative paths instead of package-relative paths. It's just one more thing to update when you want to move or rename a package.
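
For context, here is a rough sketch of the kind of entry I mean. The package name and paths are made up, and the exact keys depend on the NX version and how the workspace is set up:

```json
{
  "name": "@acme/utils",
  "scripts": {
    "build": "tsc"
  },
  "nx": {
    "targets": {
      "build": {
        "outputs": ["packages/utils/dist"]
      }
    }
  }
}
```

The cached output path is written from the workspace root (`packages/utils/dist`), so moving the package to, say, `libs/utils` means editing this entry as well, whereas a package-relative path like `dist` would keep working unchanged.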