feat(cache/unstable): add memoize() and LruCache #4725
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #4725      +/-   ##
==========================================
- Coverage   96.37%   96.21%   -0.16%
==========================================
  Files         465      473       +8
  Lines       37506    38354     +848
  Branches     5527     5575      +48
==========================================
+ Hits        36147    36903     +756
- Misses       1317     1408      +91
- Partials       42       43       +1
```

View full report in Codecov by Sentry.
This sounds fine to me.
LGTM
Great work so far: thorough tests and well thought out. I have a few requests for changes, and I have updated our tools and checks to include `@std/cache`. Please let us know if you need any help getting those done.

However, my main concern with the package in its current state is that it does too much, especially given this is the initial version. I think we should start simple and add features as we see demand. I hope the costs to the maintainer and user of not doing so are obvious. In that same vein, some thoughts:

- The `options.cache` property looks useful, but it might be better as a private property.
- Is `options.getKey` really needed, except for some edge cases?
- Is `options.truncateArgs` needed if it mainly avoids a small/trivial number of keystrokes in array mapper callbacks (and similar)? Might a simple example suffice instead (see the sketch after this list)?
- Might `options.cacheRejectedPromises` be fine to omit if we have a reasonable default? We can add it in the future if there's demand.
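A minimal sketch of the kind of example that could stand in for `options.truncateArgs`, assuming the `memoize(fn, options)` shape discussed in this thread and a hypothetical `square` function: extra callback arguments from `Array.prototype.map` never reach the memoized function, so they cannot affect the cache key.

```ts
import { memoize } from "@std/cache";

// Hypothetical memoized function, used only for illustration.
const square = memoize((n: number) => n * n);

// Passing `square` directly to map() would also forward (index, array);
// wrapping it in an arrow keeps the call unary, which is the behavior
// `options.truncateArgs` was meant to provide.
const squares = [1, 2, 2, 3].map((n) => square(n)); // [1, 4, 4, 9]
```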
Co-authored-by: Asher Gomez <ashersaupingomez@gmail.com>
Do you mean it can be supplied as an option but not publicly available as a property of the returned function? That would still allow manual manipulation if a reference was maintained, just not from a reference to the function alone. If so, that sounds reasonable.

```ts
const myCache = ...
const myFunc = memoize((arg: any) => { /* ... */ }, { cache: myCache })

export { myFunc }
// `myCache` is no longer available via `myFunc.cache`

export { myCache }
// now it's available via explicit reference
```
I think so. For example:

```ts
const user: User = { id: 1 }
const sameUser: User = { id: 1 }

const getUserDataByReference = memoize((user: User) => { /* ... */ })
getUserDataByReference(user) // computed once
getUserDataByReference(user) // still computed once
getUserDataByReference(sameUser) // now computed twice

const getUserDataById = memoize((user: User) => { /* ... */ }, { getKey: ({ id }) => id })
getUserDataById(user) // computed once
getUserDataById(sameUser) // still computed once
```
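A similar hedged sketch for a multi-argument function, assuming `getKey` receives the same arguments as the wrapped function (as in the example above); the hypothetical `area` function builds a composite string key so both arguments participate in the lookup.

```ts
import { memoize } from "@std/cache";

// Hypothetical two-argument function; the composite key makes calls with
// the same (width, height) pair share a single cache entry.
const area = memoize(
  (width: number, height: number) => width * height,
  { getKey: (width: number, height: number) => `${width}x${height}` },
);

area(2, 3); // computed once
area(2, 3); // served from the cache
area(3, 2); // computed again: different key "3x2"
```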
Fair point, I'll remove that and add an example.
Probably the most sensible default is the current one, i.e. to remove promises from the cache upon rejection (after all, if the function threw synchronously, that result obviously wouldn't be cached either). And it can be worked around easily enough by wrapping the result/rejection of the inner function in something akin to a settled result.
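A minimal sketch of that workaround, assuming the default described above (rejected promises are evicted from the cache) and a hypothetical `fetchUser` function: fold any rejection into a settled-result value so the cached promise always resolves, then rethrow when unwrapping.

```ts
import { memoize } from "@std/cache";

type Settled<T> =
  | { status: "fulfilled"; value: T }
  | { status: "rejected"; reason: unknown };

// Hypothetical inner function that may reject.
declare function fetchUser(id: number): Promise<string>;

// The memoized wrapper never rejects, so its promise stays in the cache.
const fetchUserSettled = memoize(async (id: number): Promise<Settled<string>> => {
  try {
    return { status: "fulfilled", value: await fetchUser(id) };
  } catch (reason) {
    return { status: "rejected", reason };
  }
});

// Callers unwrap the settled value, rethrowing cached rejections.
async function fetchUserCached(id: number): Promise<string> {
  const result = await fetchUserSettled(id);
  if (result.status === "rejected") throw result.reason;
  return result.value;
}
```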
I see. So based on your feedback, I suggest we do the following:
Does this all sound reasonable? Again, if we see confirmed demand for these functionalities in the future, we can add them in a non-breaking manner. WDYT?
I just realised I hadn't clicked "Submit review" from like a week ago 🤦🏾‍♂️
Please review once more @kt3k |
LGTM!
Closes #4608

Some questions:

- What should count as "use" for `LruCache`? Currently, all three of `get`, `set`, and `has` are tracked. In other words, when deleting the "least-recently-used" entry, it means deleting the entry that was least-recently touched by any of `get`, `set`, or `has` (a sketch follows this list).
- Do we want a bare `memoize(fn)(arg)` call to be memoized, with the following gotcha: …
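A hedged sketch of the `LruCache` eviction semantics described in the first question above, assuming a constructor that takes a maximum entry count and `Map`-like `get`/`set`/`has` methods, all of which count as "use".

```ts
import { LruCache } from "@std/cache";

const cache = new LruCache<string, number>(2); // holds at most 2 entries

cache.set("a", 1);
cache.set("b", 2);
cache.has("a");    // counts as use, so "a" becomes the most recently touched
cache.set("c", 3); // evicts "b", the least recently touched entry

cache.has("a"); // true
cache.has("b"); // false (evicted)
cache.has("c"); // true
```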