PROPOSAL: Cache policy #136
Comments
Are the ExecuteAndCapture variants going to be supported? |
Yes - good catch. And since the implementation obviously checks for a cached value before executing the user delegate, it makes no difference to the operation of the |
I'm happy to work on this, as we will need this for a project I'm working on. |
@SeanFarrow Sounds good! Many thanks for your involvement and offering to work up a PR on this! Please do come back to me questions / comments etc as they arise. |
Ok, I will. I’m able to start working on this at the end of the month, I’ve got a major project to finish first! |
Sure @SeanFarrow . Thanks for all your support and involvement!
All sound like good options! Given we'd likely want to avoid taking dependencies on all these in the main Polly package, the individual cache implementations (default memory cache excepted) would probably go out as separate NuGet packages. |
@community : other caches you'd like to see supported? |
Definitely separate NuGet packages. That makes more sense in terms of dependencies, as each cache is likely to depend on other NuGet packages. Also, they can then be updated out of band of the main Polly package, assuming of course that the interface doesn’t change! I’m also thinking about a Polly.Cache.Core package containing the interface and the memory cache, as per my understanding the memory cache is baked into the .NET Framework—correct me if I’m wrong! In terms of other caches we may want to support, maybe Azure Cache / Amazon ElastiCache; I need to check whether the latter has an API specifically, or whether it’s just memcached-compatible. |
@SeanFarrow All sounds good. Only thought: maybe the default |
Ok, fair point, we’ll go with that then! |
What version of .NET are we supporting? I notice that Polly supports .NET 3.5, but MemoryCache is .NET 4.0+. |
Good question. Per #142 we will discontinue .NET 3.5 support from Polly v5.0.0, as a number of the other new policies require facilities not in .NET 3.5 either. |
Ok, cool, can you assign me to the cache policy then? |
Hey @SeanFarrow . Hmm. It seems from the GitHub instructions that I can't add you with the assignees button (same for Jerome and Bruno), because its scope is limited to AppvNext org members (and since AppvNext is also broader than Polly, it's not my position just to add you to AppvNext). No reflection on the importance of your contribution to Polly (great to have you involved!). Consider this assigned. Added the in progress label to indicate that it is spoken for! |
Ok, thanks! |
Given we're supporting async variants of execute, should we have an async cache provider as well? |
Hey @SeanFarrow Great questions. What were your thoughts? Some thoughts:

[1] Yes, good call. Polly having an async cache provider, so that async executions through Polly can take advantage where 3rd-party caches have (mature/stable) async APIs – we should do that. Feels like two separate interfaces in Polly for sync and async providers, ie:

    // namespace Polly.Cache
    interface IResultCacheProviderAsync<TResult>
    {
        Task<TResult> GetAsync(Context context);   // async get of any cached value
        Task PutAsync(Context context, TResult value); // async put
    }

NB If you think the design of these interfaces could be improved, do say.

[2] The config overloads configuring Polly's async cache policy could take either an async or a sync cache provider. So:

    // Async policy taking async cache provider
    CachePolicy<TResult> asyncCachePolicy = Policy
        .CacheAsync<TResult>(IResultCacheProviderAsync<TResult> cacheProviderAsync);

    // Async policy taking sync cache provider
    CachePolicy<TResult> asyncCachePolicy = Policy
        .CacheAsync<TResult>(IResultCacheProvider<TResult> cacheProvider);

Re:

Again, really interested to hear your views. Thinking aloud from my side:

[3] The config overloads configuring Polly's sync cache policy should probably only take sync cache providers. IE just:

    // Sync policy taking sync cache provider
    CachePolicy<TResult> cachePolicy = Policy
        .Cache<TResult>(IResultCacheProvider<TResult> cacheProvider);

(The opposite – providing an overload for a sync cache policy taking an async provider – feels like creating a potentially confusing API. Particularly, there’s a risk people would mistake that syntax for giving them the benefits of async behaviour, when it’d not be: it’d have to block on the calls to the async cache provider to bring it into a sync policy/sync call, no?)

[4] But we could allow the use of third-party caches with async-only interfaces in Polly’s sync CachePolicy … Hmm. The choice of C# clients for some of these caches has moved on since I was last involved, in some cases. Which of the 3rd-party caches are currently offering an async-only (no sync) API?

Thoughts on all this? |
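To make the blocking concern in [3]/[4] concrete, here is a minimal sketch of what bringing an async-only cache client behind the proposed sync provider would look like. The sync interface shape follows the proposal's Get/Put-around-Context design; SomeAsyncOnlyCacheClient is hypothetical, not a real library.

```csharp
// Assumed sync provider shape, per the proposal's Get/Put-around-Context design:
interface IResultCacheProvider<TResult>
{
    TResult Get(Context context);
    void Put(Context context, TResult value);
}

// Hypothetical adapter: exposes an async-only cache client behind the sync interface.
class BlockingCacheProviderAdapter<TResult> : IResultCacheProvider<TResult>
{
    private readonly SomeAsyncOnlyCacheClient _client; // hypothetical third-party client

    public BlockingCacheProviderAdapter(SomeAsyncOnlyCacheClient client)
    {
        _client = client;
    }

    public TResult Get(Context context)
    {
        // Sync-over-async: blocks the calling thread (and risks deadlocks where a
        // synchronization context is present) - exactly the concern raised in [3]/[4].
        return _client.GetAsync<TResult>(context.ExecutionKey).GetAwaiter().GetResult();
    }

    public void Put(Context context, TResult value)
    {
        _client.PutAsync(context.ExecutionKey, value).GetAwaiter().GetResult();
    }
}
```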
Ok, thinking out loud: Also, whilst I think about it, how do we want to handle the conversion from the cached value to the TResult type? I’ve got a situation, for example, where I’m storing a base64-encoded compressed file (a zip in this case), so I can’t just do new ZipArchive or the equivalent; it needs an extra processing step! |
Oops on my part: yes definitely! (more on other q later) |
Hey @SeanFarrow . Great to have all this on the cache policy! Re:
Where were you thinking this would sit in the architecture? As part of the cache provider? My instinct is to keep the main cache-provider Get/Put signatures as they are,
rather than extend those with additional formatter/conversion parameters.
(The formatter probably makes sense for some kinds of cache but not others.) So thinking of it structurally as a cache-implementation concern, my instinct is for the transform-for-caching functionality being part of particular cache-provider implementations. Sound sensible? / can you see disadvantages? / or just stating the obvious?? 👍 re conversion interface. If we went as above ... and if there were a group of cache implementations (cloud caches?) where this approach might be particularly useful, one could still eg structure that with an abstract base class taking a conversion interface like you say, and some cache implementations deriving from that ... Further thoughts? (You're deep in and may see other angles!) |
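As an illustration of the structure being discussed (all names here are hypothetical, not Polly APIs): a cache-provider base class that takes a conversion interface in its constructor, so the transform-for-caching step stays inside particular cache implementations rather than on the core policy surface.

```csharp
using System;

// Hypothetical conversion interface, along the lines suggested above.
public interface ICacheValueConverter<TResult>
{
    object ToCache(TResult value);      // eg compress / base64-encode for storage
    TResult FromCache(object cached);   // eg decode / decompress on retrieval
}

// Hypothetical abstract base: cloud-cache (or other) providers derive from this
// and implement only the raw storage calls; conversion is handled here.
public abstract class ConvertingCacheProvider<TResult>
{
    private readonly ICacheValueConverter<TResult> _converter;

    protected ConvertingCacheProvider(ICacheValueConverter<TResult> converter)
    {
        _converter = converter;
    }

    public TResult Get(string cacheKey)
    {
        var raw = GetRaw(cacheKey);
        return raw == null ? default(TResult) : _converter.FromCache(raw);
    }

    public void Put(string cacheKey, TResult value, TimeSpan ttl)
    {
        PutRaw(cacheKey, _converter.ToCache(value), ttl);
    }

    protected abstract object GetRaw(string cacheKey);
    protected abstract void PutRaw(string cacheKey, object cached, TimeSpan ttl);
}
```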
I agree with you re scoping. Finally, bear in mind that converting a value might not be straightforward: take the case where you have cached some compressed data; decompressing that data might require more than just calling a class constructor, you may need to read from a memory stream for example. |
hey @SeanFarrow Great qs. Completely with you about needing conversion funcs rather than just constructors. How do you see this:
looking in actual code? We'd need to avoid the various gotchas flowing from having optional parameters in interfaces (like the default values in the interface taking precedence over values in any implementations of the interface, if the call is being made against the interface not an implementation), but maybe that is not what you were thinking anyway? |
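A small self-contained example of the optional-parameter gotcha just mentioned (names hypothetical): the default value used is the one declared on the compile-time type of the reference, so calls made against the interface ignore any different default declared on the implementation.

```csharp
using System;

interface ICacheConverter
{
    string Convert(string cached, bool decompress = false);
}

class ZipCacheConverter : ICacheConverter
{
    public string Convert(string cached, bool decompress = true)
        => $"decompress={decompress}";
}

static class OptionalParameterGotcha
{
    static void Main()
    {
        ICacheConverter viaInterface = new ZipCacheConverter();
        ZipCacheConverter viaClass = new ZipCacheConverter();

        // The interface's default (false) wins when calling through the interface:
        Console.WriteLine(viaInterface.Convert("cached-blob")); // decompress=False
        // The implementation's default (true) applies only when calling the class directly:
        Console.WriteLine(viaClass.Convert("cached-blob"));     // decompress=True
    }
}
```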
Hadn’t thought of that! |
Mutable policies by property-injection/setter-method injection: a possible trap for the unwary in highly concurrent / multi-threaded scenarios? (Setting the output converter then executing is not atomic; risk that some thread sets the cache output converter while another thread is mid-execution?) (Might not be the way we envisage it being used, but it opens up the possibility.) Constructor-injection somewhere (resulting in an immutable policy) safer? |
Possibly, yes, but what if I want a different converter per type? |
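One possible shape for that (a sketch only; all names hypothetical): supply a per-type set of converters once, at construction, so the configuration remains immutable after setup while still allowing a different converter per result type.

```csharp
using System;
using System.Collections.Generic;

public interface ICacheValueConverter<TResult>
{
    object ToCache(TResult value);
    TResult FromCache(object cached);
}

public sealed class ConverterRegistry
{
    // Captured once at construction; nothing can be swapped mid-execution.
    private readonly IReadOnlyDictionary<Type, object> _converters;

    public ConverterRegistry(IReadOnlyDictionary<Type, object> converters)
    {
        _converters = converters;
    }

    public ICacheValueConverter<TResult> For<TResult>()
        => (ICacheValueConverter<TResult>)_converters[typeof(TResult)];
}

// Usage sketch (converter classes hypothetical):
// var registry = new ConverterRegistry(new Dictionary<Type, object>
// {
//     [typeof(byte[])] = new ZipConverter(),
//     [typeof(string)] = new PassThroughConverter()
// });
// var zipConverter = registry.For<byte[]>();
```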
Hi,
All this is useful, but could you point me to the source and client source on GitHub?
Cheers
Sean.
Quoting mfjerome: @SeanFarrow Hi Sean, excuse me for the delay.
· Main website: https://pivotal.io/en/big-data/pivotal-gemfire
· Documentation: http://gemfire.docs.pivotal.io/
· C# Client API documentation: http://gemfire.docs.pivotal.io/docs-gemfire/gemfire_nativeclient/dotnet-caching-api/gemfire-csharp-dotnet-api.html
· C# Client API code example: http://gemfire.docs.pivotal.io/docs-gemfire/gemfire_nativeclient/programming-examples/csharp-example.html
|
Hello, I'm curious if you have a prediction of when this feature might land? It seems like there's been some promising work, but it's gone quiet recently. I have a strong interest in using the caching policy in combination with retry and circuit breaker for HTTP calls. I'd also be happy to contribute if you need any help. |
@perfectsquircle Yes, among all the other features that got delivered at v5.0, this got left behind. I've wanted to take it forward, but it's been behind other things: contribution would be very welcome! We have quite a developed architecture (thanks also to @SeanFarrow !), so the main thing we need now is some first cache provider implementations to plug into that. I've just re-based the architecture against latest Polly / stuff I'm about to release. Mini tour: You construct a CachePolicy specifying:
So a typical cache might be configured something like:
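The configuration example referred to here isn't preserved in this thread; as a rough illustration only (based on the pre-release architecture described above, so class and method names may differ from the final API), configuration and use might look something like this:

```csharp
// Rough sketch only - pre-release API; names may differ in the released packages.
ISyncCacheProvider cacheProvider =
    new MemoryCacheProvider(System.Runtime.Caching.MemoryCache.Default); // from Polly.Caching.MemoryCache

CachePolicy cachePolicy = Policy.Cache(cacheProvider, TimeSpan.FromMinutes(5)); // cache duration / ttl

// The Context's execution key is used as the cache key:
string result = cachePolicy.Execute(
    () => FetchProductPage(),            // hypothetical delegate; only called on a cache miss
    new Context("GetProductPage"));
```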
This test shows the basic usage. So we need to implement some ICacheProviders. @perfectsquircle Are you interested in in-process/local caching? (eg
The architecture also envisages support for serializers like Protobuf etc: let me know if you have any interest in that, and we can discuss further. Otherwise let's leave for now. I am very available for further help / guidance, if you want to work on this! Any of the above you'd be interested in tackling? (And: thank-you!) |
@perfectsquircle I’m also, like @reisenberger, happy to help, although my time is limited currently.
If you are more interested in local caches I’m happy to write the cloud-based ones. Let us know how you wish to help and we’ll give you any support that is needed.
|
Thank you for the comprehensive update. Maybe I'll get my feet wet and try to implement the memory or disk ICacheProvider. I suppose it would be sufficient to target .NET Standard 1.0 for these plugins? |
@perfectsquircle Great! I made a start on a skeleton Visual Studio solution, build file, NuGet packager etc for the MemoryCache repo early this morning. I can probably push that to GitHub in about an hour's time ...
As low a .NET Standard version as we can get away with. It looks from package search as if the lowest .NET Standard for MemoryCache might be .NET Standard 1.3. Fine if that's the case. Although the core Polly package targets .NET Standard 1.0 (soon to change to .NET Standard 1.1 when we release #231), it shouldn't be a problem to make the MemoryCache repo target .NET Standard 1.3 instead. The range of cache providers we're targeting will inevitably mean some have differing target support - delivering them through separate NuGet packages will let us deal with that. |
@perfectsquircle At https://github.com/App-vNext/Polly.Caching.MemoryCache, there is now a repo ready to fork and develop on. TL;DR All we need to do now is start developing the provider itself. I put in a dummy class and test only to test the build script. The repo intentionally keeps the three-target layout (.NET 4.0, .NET 4.5 and .NET Standard) that Polly has, for now. Theoretically we could drop .NET 4.5 as a separate target and have .NET 4.5 consumers reference .NET Standard, but targeting .NET Standard from .NET Framework is very noisy until Microsoft (hopefully) fix this in .NET Standard 2.0. For MemoryCache, you may have to change the .NET Standard 1.0 package to target .NET Standard 1.3, if package search was accurate. (I left it at .NET Standard 1.0, so that this commit could be a useful master for other cache providers.) Finally, to reference the interface … Phew - but that gets us a baseline to develop on! Let me know if it makes sense / whatever questions - whether around tooling or the code itself. Huge thank you for your contribution! |
Hi @reisenberger, I haven't gotten around to working on this. Things got crazy at work. I might try take another crack at it again soon. |
Hi @reisenberger |
Hi Joe,
What parts of caching are you specifically interested in?
Kind regards
Sean.
|
@joelhulen I have pulled the latest Cache rebase down onto this branch on App-vNext/Polly. A build from this branch will publish an appropriately tagged pre-release Polly build which you can push to NuGet. @JoeBrockhaus : @joelhulen plans to push the above to NuGet as a pre-release. @JoeBrockhaus We would welcome contributions if you are able to contribute to the Polly cache implementation - let us know what you would be interested in doing! [ I can get back to CachePolicy myself likely in the second half of June. ] |
@reisenberger @JoeBrockhaus Sorry, the notification for this thread got lost amongst my piles of emails. Sometimes it's faster getting ahold of me on the Polly slack channel ;-) I'll work toward releasing the pre-release NuGet and notify everyone here once it's up. |
@reisenberger @JoeBrockhaus I've published those pre-release NuGet packages. Please let me know if you have any issues finding or using them. |
@SeanFarrow Sorry for the super-delay on this feedback. I was looking to incorporate a combination of Retry with a CircuitBreaker to proactively serve from Cache before failing on new requests whose dependencies would likely fail, but for which cached data would suffice. |
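For what it's worth, the kind of composition described there might look roughly like this with PolicyWrap. This is a sketch only: the cache syntax follows the pre-release architecture discussed above, asyncCacheProvider / httpClient / the URL are placeholders, and exact overloads may differ.

```csharp
// Sketch only - cache + retry + circuit breaker composed via PolicyWrap.
var retry = Policy
    .Handle<HttpRequestException>()
    .WaitAndRetryAsync(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

var breaker = Policy
    .Handle<HttpRequestException>()
    .CircuitBreakerAsync(exceptionsAllowedBeforeBreaking: 4, durationOfBreak: TimeSpan.FromSeconds(30));

var cache = Policy
    .CacheAsync<string>(asyncCacheProvider, TimeSpan.FromMinutes(5)); // asyncCacheProvider: an async cache provider implementation

// Outermost to innermost: check the cache first; on a miss, retry the call,
// with the circuit breaker guarding the underlying dependency.
var policyWrap = cache.WrapAsync(retry.WrapAsync(breaker));

string page = await policyWrap.ExecuteAsync(
    ctx => httpClient.GetStringAsync("https://example.org/data"),
    new Context("get-example-data")); // execution key doubles as the cache key
```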
Don’t worry about the delay, what cache were you looking to use? Do you want sync or async?
|
Would likely be async, though I'm not sure it would be a blocker either way. |
Hi,
What specific cache(s) are you looking to use?
|
All, I've just been looking at MemoryCache; we can't provide an async API, as one does not exist. Does anyone see a problem with this? |
Hi @SeanFarrow . Re:
I don't think this is a significant problem. We can simply write an implementation for |
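For illustration, a provider over the synchronous MemoryCache can still satisfy an async provider interface by doing the work synchronously and returning already-completed tasks. A minimal sketch (the interface shape here is assumed for illustration, echoing the async-provider idea earlier in this thread, and is not necessarily the exact Polly interface):

```csharp
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

// Assumed interface shape, for illustration only.
public interface IAsyncCacheProvider<TResult>
{
    Task<TResult> GetAsync(string key);
    Task PutAsync(string key, TResult value, TimeSpan slidingExpiration);
}

public class MemoryCacheProvider<TResult> : IAsyncCacheProvider<TResult>
{
    private readonly MemoryCache _cache = MemoryCache.Default;

    public Task<TResult> GetAsync(string key)
    {
        // MemoryCache has no async API: complete synchronously, return a finished task.
        object value = _cache.Get(key);
        return Task.FromResult(value is TResult typed ? typed : default(TResult));
    }

    public Task PutAsync(string key, TResult value, TimeSpan slidingExpiration)
    {
        _cache.Set(key, value, new CacheItemPolicy { SlidingExpiration = slidingExpiration });
        return Task.CompletedTask;
    }
}
```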
Is there an ETA on the caching feature? |
@dweggemans The caching feature is expected to be released in September. This branch https://github.com/App-vNext/Polly/tree/v5.3.x-cachebeta contains the latest caching architecture, ie the core classes within Polly to support caching. This repo https://github.com/App-vNext/Polly.Caching.MemoryCache contains a beta-release of a MemoryCache-based cache provider. @dweggemans : Are there particular cache providers you are looking to support? Community contributions to support new cache providers will be welcome: the required interfaces to implement (ISyncCacheProvider and/or IAsyncCacheProvider) are relatively straightforward. Polly contributors, eg @SeanFarrow , also already have a range of distributed cache providers in mind. |
@reisenberger thanks for your response. I might be able to wait a little, or else I'll build a package locally. No problem. The |
Closing via #332
The first cache provider implementation to go with it is the MemoryCache provider. The two will be released together to NuGet, as soon as we hook up the build and NuGet feed onto https://github.com/App-vNext/Polly.Caching.MemoryCache. /cc @joelhulen |
Proposal: Cache policy

Purpose
To provide the ability to serve results from a cache, instead of executing the governed func.

Scope
Result-returning executions (Func<TResult>), not void (Action) executions.

Configuration
The cache item expiry duration would be configured on the CachePolicy at configuration time, not passed at execution time:

Configuring expiry on the CachePolicy is in line with the Polly model, where the behavioural characteristics of policies are defined at configuration time, not call time. This makes for a cache policy with given behaviour which can be shared across calls.

Extra .Execute() overloads on CachePolicy (to pass expiry at execution time) would present problems for the integration of CachePolicy into PolicyWrap. (The PolicyWrap (was Pipeline) feature in its proposed form requires that all policies have common .Execute() etc overloads.)

Configuration syntax
For Policy<TResult> - default cache
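The code sample that accompanied this heading is not preserved above; as a sketch of the shape implied by the Default implementation section below (signature assumed, not a confirmed final API):

```csharp
// Sketch only: default-cache overload, taking just a sliding-expiration timespan.
// ProductPage is a placeholder result type.
CachePolicy<ProductPage> cachePolicy = Policy
    .Cache<ProductPage>(slidingExpirationTimespan: TimeSpan.FromMinutes(5));
```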
For Policy<TResult> - advanced cache
Users may want more control over caching characteristics or to use an alternative cache provider (Http cache or third-party). The following overload is also proposed:
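Again the original code sample is not preserved; a sketch of the proposed advanced overload (types assumed for illustration):

```csharp
// Sketch only: advanced overload, taking a caller-supplied cache provider.
// MyHttpCacheProvider is a hypothetical IResultCacheProvider<ProductPage> implementation.
IResultCacheProvider<ProductPage> cacheProvider = new MyHttpCacheProvider();
CachePolicy<ProductPage> cachePolicy = Policy
    .Cache<ProductPage>(cacheProvider);
```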
where:

Context itself is not the key; it is an execution context that travels with each Execute invocation on a Polly policy. Implementations should derive a cache key to use from elements in the Context. The usual cache key would be Context.ExecutionKey. See FEATURE: Add KEYs to policies and executions #139.

Designing the IResultCacheProvider Get/Put signatures around Context rather than a string cacheKey allows implementers power to develop a more complex caching strategy around other keys or user information on the Context.

Example execution syntax
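The execution example itself is not preserved; illustratively (placeholder names), execution against the policy would pass a Context whose ExecutionKey serves as the cache key:

```csharp
// Sketch only: the delegate runs only on a cache miss; Context.ExecutionKey is the cache key.
ProductPage page = cachePolicy.Execute(
    () => FetchProductPageFromOrigin(),   // hypothetical delegate
    new Context("GetProductPage"));
```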
(and other similar existing overloads taking a Context parameter)

Default implementation
The proposed implementation for the simple cache configuration overload is to use System.Runtime.Caching.MemoryCache.Default and the configured TimeSpan slidingExpirationTimespan to create a Polly.Cache.MemoryCacheProvider<TResult> : Polly.Cache.IResultCacheProvider<TResult>.

Operation
The policy:

Returns the value from cache if one is present (throwing InvalidCastException if the value in the cache cannot be cast to TResult).

Executes the executionFunc if (and only if) a value could not be returned from cache.

Takes the return value of the executionFunc and caches it under the given cache key for the given timespan.

Comments?
Comments? Alternative suggestions? Extra considerations to bear in mind?
Have scenarios to share where you'd use this? (could inform development)