
Random replacement cache #131

Open · jchook wants to merge 5 commits into toomuchdesign:master from jchook:random-cache
Conversation

@jchook commented Jun 9, 2020

What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)

Feature

What is the current behaviour? (You can also link to an open issue here)

Very good.

What is the new behaviour?

Adds random replacement (RR) to the out-of-the-box cache strategies.

Does this PR introduce a breaking change? (What changes might users need to make in their application due to this PR?)

No.

Other information:

LRU and FIFO strategies can both lead to many guaranteed cache misses under "worst-case" access patterns. Random replacement alleviates this problem, enabling a predictable trade-off between cache size and recomputation, even under those access patterns.
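To make the worst case concrete, here's a small simulation (a sketch, not code from this PR; `fifoHits` and `sweep` are hypothetical names): with a FIFO cache of size 3, repeatedly sweeping 4 keys in order yields zero hits, because each insertion evicts exactly the key that will be requested next.

```javascript
// FIFO cache simulation: sequentially cycling through
// cacheSize + 1 keys guarantees a miss on every access.
function fifoHits(cacheSize, accesses) {
  const cache = new Set(); // Set preserves insertion order => FIFO order
  let hits = 0;
  for (const key of accesses) {
    if (cache.has(key)) {
      hits++;
    } else {
      if (cache.size >= cacheSize) {
        // Evict the oldest (first-inserted) key.
        cache.delete(cache.keys().next().value);
      }
      cache.add(key);
    }
  }
  return hits;
}

// Sweep 4 keys five times over a 3-entry cache: zero hits.
const sweep = [];
for (let i = 0; i < 5; i++) sweep.push('a', 'b', 'c', 'd');
```

An LRU cache behaves identically on this pattern. A random-replacement cache breaks the pathology because the evicted key is unpredictable, so some fraction of the swept keys survive each pass.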

Please check if the PR fulfills these requirements:

  • Tests for the changes have been added
  • Docs have been added / updated

@coveralls commented Jun 9, 2020

Coverage remained the same at 100.0% when pulling dc2199c on jchook:random-cache into 8559efe on toomuchdesign:master.

@jchook force-pushed the random-cache branch 5 times, most recently from 5fb1368 to 7ea7f60 (June 10, 2020 00:31)
@toomuchdesign (Owner) left a comment

Hi @jchook,
thanks for your detailed PR. Reading through it was a real pleasure.

I left a few comments and some ideas for a few changes. Feel free to reply and discuss about them.

I've never thought of implementing such a kind of cache. What do you use it for?

src/cache/RrMapCache.js (outdated, resolved)
src/cache/RrMapCache.js (resolved)
Comment on lines 23 to 28
const index = this._cacheKeys.indexOf(key); // O(n)
if (index > -1) {
  this._cache.delete(key);
  this._cacheLength--;
  this._cacheKeys[index] = this._cacheKeys[this._cacheLength];
}
@toomuchdesign (Owner)

If _cacheLength and _cacheKeys got removed, this could become a simple:

this._cache.delete(key);

..right? :)

@jchook (Author) commented Nov 3, 2023

To follow up here: yes, it would be much simpler.

This is a CPU performance optimization at the cost of storing keys redundantly in memory.

Thinking of a better way to acquire a random key from a Map...
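For reference, one way to pick a random key straight from a Map without a parallel key array is to advance its key iterator to a random index (a sketch, not the PR's implementation; it trades the O(1) pick for O(n) per eviction):

```javascript
// Pick a uniformly random key from a Map by walking its iterator
// to a random position. O(n) time per pick, no redundant key storage.
function randomKey(map) {
  if (map.size === 0) return undefined;
  const target = Math.floor(Math.random() * map.size);
  let i = 0;
  for (const key of map.keys()) {
    if (i === target) return key;
    i += 1;
  }
}
```

The parallel `_cacheKeys` array in the PR is the opposite trade: O(1) random selection at the cost of storing every key twice.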

Comment on lines 11 to 18
if (this._cacheLength >= this._cacheSize) {
this._randomReplace(key, selectorFn);
} else {
this._cache.set(key, selectorFn);
this._cacheKeys[this._cacheLength] = key;
this._cacheLength++;
}
}
@toomuchdesign (Owner)

If I get it right, there are two different setting paths here:

  • push new entry normally
  • replace existing entry when cache size limit is hit

Would it be possible to get the same behaviour with a single setting logic?

Something like:

1- delete existing random entry if cache limit is hit (_randomReplace might change its scope)
2- add new entry

I understand this would no longer be a strict implementation of an RR cache, but users would get the same result, if I haven't missed anything.
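A sketch of that single setting path (hypothetical names; `RrCacheSketch` and `_randomEvict` are illustrative stand-ins, with `_randomEvict` playing the rescoped `_randomReplace`):

```javascript
// Single-path setting logic sketch: evict one random entry when the
// limit is hit, then always set. Names here are hypothetical.
class RrCacheSketch {
  constructor(cacheSize) {
    this._cacheSize = cacheSize;
    this._cache = new Map();
  }
  _randomEvict() {
    // Delete one uniformly random existing entry.
    const keys = [...this._cache.keys()];
    this._cache.delete(keys[Math.floor(Math.random() * keys.length)]);
  }
  set(key, selectorFn) {
    if (!this._cache.has(key) && this._cache.size >= this._cacheSize) {
      this._randomEvict();
    }
    this._cache.set(key, selectorFn);
  }
  get(key) {
    return this._cache.get(key);
  }
}
```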

src/cache/RrObjectCache.js (3 outdated review threads, resolved)
@jchook (Author) commented Jun 21, 2020

I left a few comments and some ideas for a few changes. Feel free to reply and discuss about them.

Thanks for taking the time to review!

I've never thought of implementing such a kind of cache. What do you use it for?

Basically I have run into "worst case" scenarios where the amount of data exceeds the cache size and a tight loop sequentially accesses cacheSize + n items. In that scenario, both LRU and FIFO suffer many guaranteed, systematic misses.

The RR method alleviates this issue somewhat, while still providing a reasonably competitive hit/miss ratio with other access patterns. I'm not an expert on the topic, but I can cite at least one study from 2004:

...in some cases, such as 253.perlbmk for 16KB and 32KB caches, Random policy significantly outperforms the other two.

As it could be expected, LRU policy in data caches has better performance than FIFO and Random, across almost all evaluated benchmarks and cache sizes. Yet there are some exceptions: for 301.appsi, 253.perlbmk, and 183.equake, Random policy is sometimes slightly better than LRU.

In L1 data cache there is no clear winner between FIFO and Random replacement policy, and the difference between the two decreases as the cache size increases. For one group of applications, 172.mgrid, 176.gcc, 197.parser, and 300.twolf, caches with FIFO replacement have less misses for all considered cache sizes and organizations. For other group, 191.fma3d, 186.crafty, and 183.equake, Random always outperforms FIFO. For the rest of the considered benchmarks, for smaller cache sizes FIFO dominates, while for larger caches Random policy is better or same as FIFO.

Most of the academic studies surrounding the issue focus on hardware caches and don't apply directly to JavaScript land, but I think the intuition is there for why an RR cache might complement FIFO and LRU.

@toomuchdesign (Owner)

I agree an RR cache could be an interesting tool to provide. If the cache folder grew further I'd consider splitting the extra implementations into a separate package, but I'll put that off for now.

I was curious to see how this library gets used in applications that were never taken into account when it was originally written.

To make a long story short, what if we just:

  • rename _cacheKeys to _cacheOrdering
  • remove _cacheLength in favour of this._cache.size

Thanks again!
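Under those two suggestions, the earlier swap-remove thread might look like this (a sketch under those assumptions; `KeyedRrCache` is a hypothetical name, and `_cacheLength` is derived from the `_cacheOrdering` array instead of tracked separately):

```javascript
// Sketch: keep the parallel key array (renamed _cacheOrdering) for
// O(1) random picks, but drop _cacheLength in favour of derived sizes.
class KeyedRrCache {
  constructor(cacheSize) {
    this._cacheSize = cacheSize;
    this._cache = new Map();
    this._cacheOrdering = []; // parallel array of cached keys
  }
  set(key, value) {
    if (!this._cache.has(key)) {
      if (this._cache.size >= this._cacheSize) {
        // Replace a random slot's key in place (the RR eviction).
        const i = Math.floor(Math.random() * this._cacheOrdering.length);
        this._cache.delete(this._cacheOrdering[i]);
        this._cacheOrdering[i] = key;
      } else {
        this._cacheOrdering.push(key);
      }
    }
    this._cache.set(key, value);
  }
  remove(key) {
    const index = this._cacheOrdering.indexOf(key); // O(n)
    if (index > -1) {
      this._cache.delete(key);
      // Swap-remove: move the last key into the vacated slot.
      this._cacheOrdering[index] =
        this._cacheOrdering[this._cacheOrdering.length - 1];
      this._cacheOrdering.pop();
    }
  }
}
```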

@toomuchdesign (Owner)

How are you doing @jchook?
Will you have a chance to complete the PR with the two changes we discussed above?

All the best 🖖 :)

@jchook (Author) commented Sep 5, 2020

Hey @toomuchdesign thanks for your cordial patience on this.

When deploying these changes in the wild, I discovered a performance issue with the RrObjectCache.prototype.remove() method on some platforms (e.g. Android via react-native). I would like to also fix that.

Currently I'm helping my mom build an addition on her home (fun! but also time-consuming). When that finishes I will have plenty of time. If you're willing to let this bake in the sun for a while I will eventually get to it. Otherwise we can close as I am happily using it in production now and just wanted to share.

@toomuchdesign (Owner)

Hi @jchook, great to hear from you!
This branch is all yours :)

Feel free to complete the job when you have time and ping me when you're done.
In the meantime, enjoy your time building REAL things!

Cheers!

@jchook (Author) commented Nov 3, 2023

Hey, finally revisiting this PR.

After reviewing the PR, I think we should drop the RrObjectCache entirely and only offer an RrMapCache for the many reasons outlined here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map#objects_vs._maps

Performs better in scenarios involving frequent additions and removals of key-value pairs.

Also, it was a joy to re-read your feedback, and I've made adjustments.

@toomuchdesign (Owner)

Hey @jchook,
happy to see you around!

Yep, I agree about removing the plain object caches. We might publish your PR as a minor and then release a major with the cleanup.

I've enabled tests; there seems to be something red :)

3 participants