CacheProvider for future scaling needs #168
Cache needs to be abstracted, with a default memory implementation and providers for Memcached or Redis.
I agree, this is IMO one of the most needed features for heavy production usage.
👍
Abstracted out the cache to a CacheProvider class, might be worth looking into: https://github.com/maysale01/parse-server/tree/es6-all-tests-passing
Providers (including cache) should return a promise to resolve, allowing data to be fetched/refreshed from external storage; otherwise data in a local cache may not be synced easily. Using Promises rather than callbacks would also be consistent with the rest of the code. Something like
this way the cacheProvider may store data externally and, based on implementation logic, choose to refresh or return a local copy...
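A minimal sketch of such a Promise-based provider, assuming an agreed `get`/`set`/`remove` surface (the class and method names here are illustrative, not an agreed API):

```javascript
// Hypothetical Promise-based cache provider. Every method returns a
// Promise, so a memory, Redis, or memcached backend can share one
// interface: a networked backend resolves after its round trip, while
// the memory backend simply resolves with a local copy.
class MemoryCacheProvider {
  constructor() {
    this._store = new Map();
  }

  get(key) {
    // An external backend would fetch/refresh here before resolving.
    return Promise.resolve(this._store.get(key));
  }

  set(key, value) {
    this._store.set(key, value);
    return Promise.resolve();
  }

  remove(key) {
    this._store.delete(key);
    return Promise.resolve();
  }
}
```

Because callers only ever see a Promise, swapping the memory backend for a Redis-backed one would not require touching any call sites.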
+1. The whole interface needs to be redesigned. I was just trying to get the tests passing with ES6 and some basic abstraction. I'll whip something up tomorrow. |
CacheProvider definitely sounds like a step in the right direction. For one of our production apps we get a crazy level of traffic, so we built a custom caching layer that sits on top of Hosted Parse (we call it C3, Cloud Code Cache). It's essentially an Objective-C ...

The above is definitely way beyond what we need out of a cache for v1, but I thought I'd share the sort of things we get up to, as it's quite exciting to be able to bake this kind of advanced caching right into parse-server itself. Perhaps access to CacheProvider via a module within our custom Cloud Code would be a smart move?
I noticed that whenever a user authenticates with `Parse.User.logIn`, they're added to the `users` array in cache.js, and they're only ever removed when the user forcefully signs out (which in our case is quite rare) or when the node process restarts/crashes.

This might be fine for light traffic loads or for development use, but I can see it potentially becoming a problem for apps with large amounts of traffic, as it is likely to gobble up unnecessary RAM. It may also create inconsistency between load-balanced nodes/dynos.
I'm proposing that at some point in the future (probably not as a priority) we make the switch to Redis or Memcached to hold these variables somewhere consistent (especially when load balancing traffic, etc.). This might also pave the way for more advanced query caching in the future, to minimise database hits for popular queries that don't often change. Heroku has quite a few hosted Redis/Memcached options, so it should be fairly trivial for most users to optionally switch this behaviour on if they want.
Happy to contribute, but I thought I'd open an issue first so it can be discussed and tagged as up for grabs.
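Even before an external store lands, the unbounded growth described above could be mitigated with a per-entry TTL, so idle sessions expire instead of accumulating until the process restarts. A rough sketch (the class name and TTL handling are made up for illustration, not part of parse-server):

```javascript
// Hypothetical in-memory cache with lazy TTL eviction: entries record
// an expiry timestamp on write and are dropped on the first read after
// they expire, bounding how long stale sessions occupy RAM.
class TTLMemoryCache {
  constructor(ttlMs) {
    this._ttlMs = ttlMs;
    this._store = new Map(); // key -> { value, expiresAt }
  }

  set(key, value) {
    this._store.set(key, { value, expiresAt: Date.now() + this._ttlMs });
    return Promise.resolve();
  }

  get(key) {
    const entry = this._store.get(key);
    if (!entry) return Promise.resolve(undefined);
    if (Date.now() >= entry.expiresAt) {
      // Lazy eviction: discard the expired entry instead of returning it.
      this._store.delete(key);
      return Promise.resolve(undefined);
    }
    return Promise.resolve(entry.value);
  }
}
```

A Redis or Memcached backend would get the same behaviour for free via native key expiry, so this interface could be kept when switching backends.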