feat(cache/unstable): add memoize() and LruCache #4725

Merged: 22 commits into denoland:main on Aug 8, 2024

Conversation

@lionel-rowe (Contributor) commented May 14, 2024

Closes #4608

Some questions:

  • Which methods should count as accessing/using an entry stored in an LruCache? Currently, all three of get, set, and has are tracked. In other words, the "least-recently-used" entry to delete is the one that was least recently touched by any of get, set, or has. (A sketch of what this means for eviction follows the code below.)
  • Is "on-the-fly" usage even worth supporting? It adds a small amount of complexity to the implementation and allows memoize(fn)(arg) to be memoized, with the following gotcha:
const fn = (n: number) => n + 1

// ✅ reference to function
const fn1 = memoize(fn)
const result1 = fn1(42)

// ✅ function literal
const fn2 = memoize((n: number) => n + 1)
const result2 = fn2(42)

// ✅ on-the-fly with reference to function
const result3 = memoize(fn)(42)

// ⚠️ on-the-fly with function literal - fails to memoize
const result4 = memoize((n: number) => n + 1)(42)
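
Back to the first question, a minimal sketch of what tracking get, set, and has means for eviction order. It assumes LruCache is constructed with a maximum entry count, exposes Map-like methods, and is importable from @std/cache; treat the exact constructor signature as an assumption rather than the final API.

import { LruCache } from "@std/cache"

const cache = new LruCache<string, number>(2) // keep at most 2 entries

cache.set("a", 1)
cache.set("b", 2)
cache.has("a")    // touches "a", so "b" becomes the least-recently-used entry
cache.set("c", 3) // exceeds the max size, evicting "b" rather than "a"
cache.get("b")    // undefined
cache.get("a")    // 1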

lionel-rowe requested a review from kt3k as a code owner May 14, 2024 08:19

codecov bot commented May 14, 2024

Codecov Report

Attention: Patch coverage is 96.64430% with 5 lines in your changes missing coverage. Please review.

Project coverage is 96.21%. Comparing base (714fccb) to head (62fc155).
Report is 49 commits behind head on main.

Files Patch % Lines
cache/_serialize_arg_list.ts 92.18% 4 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #4725      +/-   ##
==========================================
- Coverage   96.37%   96.21%   -0.16%     
==========================================
  Files         465      473       +8     
  Lines       37506    38354     +848     
  Branches     5527     5575      +48     
==========================================
+ Hits        36147    36903     +756     
- Misses       1317     1408      +91     
- Partials       42       43       +1     


@kt3k (Member) commented May 14, 2024

Currently, all three of get, set, and has are tracked.

This sounds fine to me.

iuioiua changed the title from "feat(cache): add memoize and LruCache" to "feat(cache): add memoize() and LruCache" on May 22, 2024
kt3k changed the title from "feat(cache): add memoize() and LruCache" to "feat(cache/unstable): add memoize() and LruCache" on Jul 18, 2024
@kt3k (Member) left a comment

LGTM

@iuioiua (Contributor) left a comment

Great work so far - thorough tests and well thought out. I have a few requests for changes and have updated our tools and checks to include @std/cache. Please let us know if you need any help getting those done.

However, my main concern with the package in its current state is that it does too much, especially given that this is the initial version. I think we should start simple and add features as we see demand. I hope the costs to the maintainer and user of not doing so are obvious. In that same vein, some thoughts:

  1. options.cache property looks useful, but it might be better as a private property.
  2. Is options.getKey really needed, except for some edge cases?
  3. Is options.truncateArgs needed if it mainly only avoids a small/trivial number of keystrokes in array mapper callbacks (and similar)? Might a simple example suffice instead?
  4. Might options.cacheRejectedPromises be fine to omit if we have a reasonable default? We can add in the future if there's demand.

lionel-rowe and others added 5 commits August 1, 2024 12:02
Co-authored-by: Asher Gomez <ashersaupingomez@gmail.com>

@lionel-rowe (Contributor, Author)

@iuioiua

  1. options.cache property looks useful, but it might be better as a private property.

Do you mean it can be supplied as an option but not exposed publicly as a property of the returned function? That would still allow manual manipulation if a reference to the cache were maintained, just not via a reference to the memoized function alone. If so, that sounds reasonable.

const myCache = ...
const myFunc = memoize((arg: any) => { /* ... */ }, { cache: myCache })
export { myFunc }
// `myCache` is no longer available via `myFunc.cache`
export { myCache }
// now it's available via explicit reference
  2. Is options.getKey really needed, except for some edge cases?

I think getKey is essential; otherwise, caching with non-primitive arguments is always by reference (the default).

const user: User = { id: 1 }
const sameUser: User = { id: 1 }
const getUserDataByReference = memoize((user: User) => { /* ... */ })

getUserDataByReference(user) // computed once
getUserDataByReference(user) // still computed once
getUserDataByReference(sameUser) // now computed twice

const getUserDataById = memoize((user: User) => { /* ... */ }, { getKey: ({ id }) => id })

getUserDataById(user) // computed once
getUserDataById(sameUser) // still computed once
  3. Is options.truncateArgs needed if it mainly only avoids a small/trivial number of keystrokes in array mapper callbacks (and similar)? Might a simple example suffice instead?

Fair point, I'll remove that and add an example.
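
As a rough sketch of the kind of example that could cover the array-mapper case (assuming the default cache key is derived from every argument the memoized function receives, which this thread doesn't spell out):

import { memoize } from "@std/cache"

const square = memoize((n: number) => n ** 2)

// Array#map passes (value, index, array); if the memoized function is passed
// directly, the extra arguments give every call a distinct cache key
const mappedDirectly = [2, 2, 2].map(square)

// Wrapping in an arrow truncates the arguments to just the value, so the
// repeated inputs hit the cache
const mappedViaWrapper = [2, 2, 2].map((n) => square(n))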

  4. Might options.cacheRejectedPromises be fine to omit if we have a reasonable default? We can add in the future if there's demand.

Probably the most sensible default is the current one, i.e. to remove promises from the cache upon rejection (after all, if the function threw synchronously, that result obviously wouldn't be cached either). And it can be worked around easily enough by wrapping the result/rejection of the inner function in something akin to Promise.allSettled's PromiseSettledResult<T>.
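
For illustration, a minimal sketch of that workaround, using a hypothetical settle() helper that converts a rejection into a cached value shaped like Promise.allSettled's results (the import specifier is assumed):

import { memoize } from "@std/cache"

// Hypothetical helper: the returned promise never rejects, so the default
// "drop rejected promises" behavior never kicks in
const settle = async <T>(promise: Promise<T>): Promise<PromiseSettledResult<T>> => {
  try {
    return { status: "fulfilled", value: await promise }
  } catch (reason) {
    return { status: "rejected", reason }
  }
}

const fetchJson = async (url: string) => {
  const res = await fetch(url)
  if (!res.ok) throw new Error(`HTTP ${res.status}`)
  return await res.json()
}

// The cached value is always a settled result, so rejections stay cached
// alongside successes instead of being removed on rejection
const memoizedFetchJson = memoize((url: string) => settle(fetchJson(url)))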

@iuioiua (Contributor) commented Aug 1, 2024

I see. So based on your feedback, I suggest we do the following:

  1. Keep the cache option, but remove it from the returned function.
  2. Keep the getKey option.
  3. Remove the truncateArgs option, keeping its default behavior, and instead add an example for said use case.
  4. Remove the cacheRejectedPromises option, keeping its default behavior, and instead add an example for said use case.

Does this all sound reasonable? Again, if we see confirmed demand for these functionalities in the future, we can add them in a non-breaking manner. WDYT?

@iuioiua (Contributor) previously requested changes Aug 7, 2024

I just realised I hadn't clicked "Submit review" from a week ago 🤦🏾‍♂️

iuioiua requested a review from kt3k August 7, 2024 08:41
@iuioiua (Contributor) commented Aug 8, 2024

Please review once more @kt3k

@kt3k (Member) left a comment

LGTM!

kt3k merged commit 0c64f32 into denoland:main Aug 8, 2024
13 checks passed