refactor(InfiniteLoader.js): replace inefficient abuse of reduce #1277
Conversation
Unless VMs specifically optimize this, doing `concat` inside of `reduce` is wasteful because it creates an array of length 1, then an array of length 2, etc. costing `O(n^2)` space and time for something that should take `O(n)`.
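For reference, a minimal sketch of the two patterns side by side (assuming `unloadedRanges` is an array of objects with `startIndex`/`stopIndex` fields, as in the diff below; the variable names `squashedQuadratic`/`squashedLinear` are just for illustration):

```js
// Quadratic: each concat copies the whole accumulator, so step k copies
// roughly 2k elements, for O(n^2) time and space overall.
const squashedQuadratic = unloadedRanges.reduce(
  (reduced, unloadedRange) =>
    reduced.concat([unloadedRange.startIndex, unloadedRange.stopIndex]),
  [],
);

// Linear: map builds n two-element arrays, then a single concat flattens
// them once, for O(n) time and space.
const squashedLinear = [].concat(
  ...unloadedRanges.map(unloadedRange => [
    unloadedRange.startIndex,
    unloadedRange.stopIndex,
  ]),
);
```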
No CI? 😱
Hm. There are some CI issues with pull requests not coming from the source repo, but I'll test locally.
```diff
-      reduced.concat([unloadedRange.startIndex, unloadedRange.stopIndex]),
-    [],
-  );
+  const squashedUnloadedRanges = [].concat(...unloadedRanges.map(
```
I'm confused. Why can't we just do `const squashedUnloadedRanges = unloadedRanges.map(...)`?
A little-known feature of `concat` is that it flattens top-level array arguments:

> [].concat([1, 2], [3, 4])
[1, 2, 3, 4]

`unloadedRanges.map(...)` on its own returns an array of arrays rather than a flattened array. This is useful to know any time you need to flatten one level and don't want to depend on any library or utility code.
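A quick illustration of the one-level flatten (plain arrays, nothing specific to this PR):

```js
const nested = [[1, 2], [3, 4], [5]];

// map alone keeps the nesting:
nested.map(x => x);            // [[1, 2], [3, 4], [5]]

// concat spreads its array arguments one level deep:
[].concat(...nested);          // [1, 2, 3, 4, 5]

// Only the top level is flattened; deeper nesting is preserved:
[].concat(...[[1, [2]], [3]]); // [1, [2], 3]
```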
Oh I see. TIL.
Depending on unlimited varargs is not a good idea. The maximum number of arguments is context and implementation dependent.
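A hedged sketch of that failure mode and one spread-free alternative (`manyRanges` is a made-up example; the exact argument limit varies by engine):

```js
const manyRanges = Array.from({ length: 200000 }, (_, i) => [i, i + 1]);

// Spread turns every element of manyRanges into a separate call argument.
// Engines cap the number of arguments, so for large enough inputs this can
// throw a RangeError instead of returning a flattened array:
// const flattened = [].concat(...manyRanges);

// A plain push loop flattens one level in O(n) with no argument-count limit:
const flattened = [];
for (const [start, stop] of manyRanges) {
  flattened.push(start, stop);
}

// In ES2019+ environments, manyRanges.flat() also flattens one level
// without relying on call arguments.
```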
Is that true even when using `Array.prototype.concat.apply([], unloadedRanges)`?
Yup, never mind. PR coming in a moment.
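For anyone else following the exchange above: the answer is yes, because `apply` also expands the array into individual call arguments, so it shares the same engine-specific argument-count limit as the spread form. A small sketch:

```js
// f.apply(thisArg, args) calls f with args[0], args[1], ... as separate
// arguments, exactly like f(...args), so very large arrays can hit the
// same argument-count limit.
Array.prototype.concat.apply([], [[1, 2], [3, 4]]); // [1, 2, 3, 4]
```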
Unless VMs specifically optimize this, doing `concat` inside of `reduce` is wasteful because it creates an array of length 1, then an array of length 2, etc., costing `O(n^2)` time and space. `[].concat(...stuff.map(...))` does the same thing in `O(n)` time and space.

Thanks for contributing to react-virtualized!
Before submitting a pull request, please complete the following checklist:
- Unit tests (`npm test`) all pass
- Code has been formatted with prettier (`npm run prettier`)
- Types have been checked (`npm run typecheck`)

Here is a short checklist of additional things to keep in mind before submitting: