
PROPOSAL: Cache policy #136

Closed
reisenberger opened this issue Jul 11, 2016 · 76 comments

Comments

@reisenberger
Member

reisenberger commented Jul 11, 2016

Proposal: Cache policy

Purpose

To provide the ability to serve results from a cache, instead of executing the governed func.

Scope

  • A non-exception handling policy.
  • All sync and async variants
  • Only result-returning (Func<TResult>), not void (Action) executions.

Configuration

The cache item expiry duration would be configured on the CachePolicy at configuration time, not passed at execution time:

  • This keeps CachePolicy in line with the Polly model, where the behavioural characteristics of policies are defined at configuration time, not call time. This makes for a cache policy with given behaviour which can be shared across calls.
  • Adding bespoke .Execute() overloads on CachePolicy would present problems for the integration of CachePolicy into PolicyWrap. (The PolicyWrap (was Pipeline) feature in its proposed form requires that all policies have common .Execute() etc overloads.)

Configuration syntax

For Policy<TResult> - default cache

CachePolicy<TResult> cachePolicy = Policy
  .Cache<TResult>(TimeSpan slidingExpirationTimespan);

For Policy<TResult> - advanced cache

Users may want more control over caching characteristics or to use an alternative cache provider (Http cache or third-party). The following overload is also proposed:

CachePolicy<TResult> cachePolicy = Policy
  .Cache<TResult>(IResultCacheProvider<TResult> cacheProvider);

where:

// namespace Polly.Cache
interface IResultCacheProvider<TResult>
{
   TResult Get(Context context);
   void    Put(Context context, TResult value);
}
  • The Context itself is not the key; it is an execution context that travels with each Execute invocation on a Polly policy. Implementations should derive a cache key to use from elements in the Context. The usual cache key would be Context.ExecutionKey. See FEATURE: Add KEYs to policies and executions #139
  • Basing the IResultCacheProvider Get/Put signatures around Context rather than a string cacheKey allows implementers power to develop a more complex caching strategy around other keys or user information on the Context.
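To make the interface concrete, here is a minimal sketch of a provider implementation. The Context stand-in and the DictionaryCacheProvider name are illustrative assumptions for discussion, not part of the proposal; it simply keys the cache on Context.ExecutionKey as described above, with default(TResult)/null signalling a miss.

```csharp
using System.Collections.Concurrent;

// Illustrative stand-in for Polly's execution Context (see #139): carries an ExecutionKey.
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

// Hypothetical provider: keys the cache on Context.ExecutionKey, as the proposal suggests.
public class DictionaryCacheProvider<TResult> // : IResultCacheProvider<TResult>
{
    private readonly ConcurrentDictionary<string, TResult> _store =
        new ConcurrentDictionary<string, TResult>();

    public TResult Get(Context context)
    {
        TResult value;
        _store.TryGetValue(context.ExecutionKey, out value);
        return value; // default(TResult) signals a cache miss
    }

    public void Put(Context context, TResult value)
    {
        _store[context.ExecutionKey] = value;
    }
}
```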

Example execution syntax

// TResult form
TResult result = cachePolicy
  .Execute(Func<TResult> executionFunc, new Context(executionKey)); // The executionKey is the cacheKey.  See keys proposal.

(and other similar existing overloads taking a Context parameter)

Default implementation

The proposed implementation for the simple cache configuration overload is to use System.Runtime.Caching.MemoryCache.Default and the configured TimeSpan slidingExpirationTimespan to create a Polly.Cache.MemoryCacheProvider<TResult> : Polly.Cache.IResultCacheProvider<TResult>.
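A sketch of what that default provider might look like, assuming the interface above (the Context stand-in is illustrative, and a reference-typed TResult is assumed for the null-as-miss convention):

```csharp
using System;
using System.Runtime.Caching;

// Illustrative stand-in for Polly's execution Context (see #139).
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

// Sketch of the proposed default provider, wrapping MemoryCache.Default with a
// sliding expiration. Class and member names follow the proposal text above.
public class MemoryCacheProvider<TResult> // : IResultCacheProvider<TResult>
{
    private readonly MemoryCache _cache = MemoryCache.Default;
    private readonly TimeSpan _slidingExpiration;

    public MemoryCacheProvider(TimeSpan slidingExpiration)
    {
        _slidingExpiration = slidingExpiration;
    }

    public TResult Get(Context context)
    {
        object value = _cache.Get(context.ExecutionKey);
        // The cast below throws InvalidCastException if the cached value is not
        // a TResult, matching the Operation section.
        return value == null ? default(TResult) : (TResult)value;
    }

    public void Put(Context context, TResult value)
    {
        _cache.Set(context.ExecutionKey, value,
            new CacheItemPolicy { SlidingExpiration = _slidingExpiration });
    }
}
```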

Operation

  • Checks the cache for a value stored under the cache key; returns it if so.
    • Throws an InvalidCastException if the value in the cache cannot be cast to TResult
  • Invokes executionFunc if (and only if) a value could not be returned from cache.
  • Before returning a result from a non-faulting invocation of executionFunc, caches it under the given cache key for the given timespan.
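The steps above can be sketched as follows (a toy illustration only, with hypothetical names; it assumes null signals a cache miss, hence the class constraint):

```csharp
using System;

// Illustrative stand-in for Polly's execution Context (see #139).
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

// Toy sketch of the proposed operation, not Polly's implementation.
public class CachePolicySketch<TResult> where TResult : class
{
    private readonly Func<Context, TResult> _get;
    private readonly Action<Context, TResult> _put;

    public CachePolicySketch(Func<Context, TResult> get, Action<Context, TResult> put)
    {
        _get = get;
        _put = put;
    }

    public TResult Execute(Func<TResult> executionFunc, Context context)
    {
        TResult cached = _get(context);       // 1. check the cache under the cache key
        if (cached != null) return cached;    //    ...and return the value on a hit

        TResult result = executionFunc();     // 2. invoke the func only on a cache miss
        _put(context, result);                // 3. cache a non-faulting result
        return result;                        //    (an exception above skips the Put)
    }
}
```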

Comments?

Comments? Alternative suggestions? Extra considerations to bear in mind?

Have scenarios to share where you'd use this? (could inform development)

@SeanFarrow
Contributor

Are the ExecuteAndCapture variants going to be supported?

@reisenberger
Member Author

Are the ExecuteAndCapture variants going to be supported?

Yes - good catch. And since the implementation obviously checks for a cached value before executing the user delegate, it makes no difference to the operation of the CachePolicy whether any fault from the func is rethrown/passed to more outer layers, or captured.

@SeanFarrow
Contributor

I'm happy to work on this, as we will need this for a project I'm working on.

@reisenberger
Member Author

@SeanFarrow Sounds good! Many thanks for your involvement and offering to work up a PR on this! Please do come back to me with questions / comments etc as they arise.

@SeanFarrow
Contributor

SeanFarrow commented Jul 13, 2016

Ok, I will. I’m able to start working on this at the end of the month, I’ve got a major project to finish first!
What caches do we want, I’m thinking, Redis/Memcached, and the .net memory cache as well as maybe a disc based cache, with the memory cache being the default. Any other caches/thoughts?

@reisenberger
Member Author

I’m able to start working on this at the end of the month

Sure @SeanFarrow . Thanks for all your support and involvement!

What caches do we want, I’m thinking, Redis/Memcached, and the .net memory cache
as well as maybe a disc based cache, with the memory cache being the default.
Any other caches/thoughts?

All sound like good options!

And with the proposed IResultCacheProvider<TResult> interface, people can of course easily implement others.

Given we'd likely want to avoid taking dependencies on all these in the main Polly package, the individual cache implementations (default memory cache excepted) would probably go out as separate nuget packages Polly.Cache.Redis, Polly.Cache.Memcached etc, do you think? Unless they were each individually so small (in terms of code lines) that providing a wiki page for each was just as much an option - to decide later?

@reisenberger
Member Author

@community : other caches you'd like to see supported?

@SeanFarrow
Contributor

SeanFarrow commented Jul 14, 2016

Definitely separate NuGet packages. That makes more sense in terms of dependencies, as each cache is likely to depend on other NuGet packages. Also, they can then be updated out of band of the main Polly package, assuming of course that the interface doesn’t change!

I’m also thinking about a Polly.Cache.Core package containing the interface and the memory cache; as per my understanding, the memory cache is baked into the .net framework—correct me if I’m wrong!

In terms of other caches we may want to support, maybe azure cache/amazon elastic cache, I need to check whether the latter has an api specifically, or whether it’s just memcached compatible.

@reisenberger
Member Author

@SeanFarrow All sounds good.

Only thought: maybe the default MemoryCache option (and IResultCacheProvider<TResult> interface) can just be part of the main Polly package, so that the CachePolicy (based on the MemoryCache default) works out-of-the-box with just the main Polly package. Yes, MemoryCache is part of the runtime at System.Runtime.Caching.MemoryCache

@SeanFarrow
Contributor

SeanFarrow commented Jul 14, 2016

Ok, fair point, we’ll go with that then!

@SeanFarrow
Contributor

What version of .net are we supporting? I notice that Polly supports .net 3.5, but the memory cache is 4.0+.
Do people see this as an issue? If so, what should we do for caching in .net 3.5?

@reisenberger
Member Author

Good question. Per #142 we will discontinue .NET3.5 support from Polly v5.0.0, as a number of the other new policies require facilities not in .NET3.5 either.

@SeanFarrow
Contributor

SeanFarrow commented Jul 17, 2016

Ok, cool, can you assign me to the cache policy then?

@reisenberger
Member Author

reisenberger commented Jul 17, 2016

Hey @SeanFarrow . Hmm. It seems from the github instructions that I can't add you with the assignees button (same for Jerome and Bruno) because its scope is limited to AppvNext org members (and since AppvNext is also broader than Polly, it's not my position to add you to AppvNext). No reflection on the importance of your contribution to Polly (great to have you involved!). Consider this assigned. I've added the in progress label to indicate that it is spoken for!

@SeanFarrow
Contributor

SeanFarrow commented Jul 17, 2016

Ok, thanks!

@SeanFarrow
Contributor

Given we're supporting async variants of execute, should we have an async cache provider as well?
Also, if the cache doesn't support synchronous functionality, what should our default position be?

@reisenberger
Member Author

reisenberger commented Jul 18, 2016

Hey @SeanFarrow Great questions. What were your thoughts?

Some thoughts:

Given we're supporting async variants of execute, should we have an async cache provider as well?

[1] Yes, good call. Polly having an async cache provider, so that async executions through Polly can take advantage of 3rd-party caches with (mature/stable) async APIs –> we should do that. Feels like two separate interfaces in Polly for sync and async providers, ie IResultCacheProvider<TResult> and IResultCacheProviderAsync<TResult>? Is that what you were thinking? Async interface something like:

// namespace Polly.Cache
interface IResultCacheProviderAsync<TResult>
{
   TResult GetAsync(Context context);              // should return: Task<TResult>
   void    PutAsync(Context context, TResult value); // should return: Task
}

NB If you think the design of the IResultCacheProvider/Async<TResult> interfaces can be refined, feel free to say.

[2] The config overloads .CacheAsync<TResult>(...) configuring Polly’s async cache policies should probably provide options to take either a sync or an async cache provider, though. Because there might be some cache providers which only have sync APIs but we still want to offer them to async cache policies? (MemoryCache.Default is in this category?)

So:

// Async policy taking async cache provider
CachePolicy<TResult> asyncCachePolicy = Policy
  .CacheAsync<TResult>(IResultCacheProviderAsync<TResult> cacheProviderAsync);
// Async policy taking sync cache provider
CachePolicy<TResult> asyncCachePolicy = Policy
  .CacheAsync<TResult>(IResultCacheProvider<TResult> cacheProvider);
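One way the second overload could work internally is via a simple adapter presenting the sync provider through the async interface shape (an illustrative sketch; the adapter and toy provider names are assumptions, and the Context stand-in is hypothetical):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative stand-in for Polly's execution Context (see #139).
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

public interface IResultCacheProvider<TResult>
{
    TResult Get(Context context);
    void Put(Context context, TResult value);
}

// Toy sync provider, just for demonstrating the adapter.
public class InMemoryProvider<TResult> : IResultCacheProvider<TResult>
{
    private readonly Dictionary<string, TResult> _store = new Dictionary<string, TResult>();
    public TResult Get(Context context)
    {
        TResult value;
        _store.TryGetValue(context.ExecutionKey, out value);
        return value;
    }
    public void Put(Context context, TResult value) { _store[context.ExecutionKey] = value; }
}

// Hypothetical adapter: presents a sync-only provider through the proposed async
// interface shape, completing synchronously.
public class SyncToAsyncCacheProvider<TResult>
{
    private readonly IResultCacheProvider<TResult> _inner;
    public SyncToAsyncCacheProvider(IResultCacheProvider<TResult> inner) { _inner = inner; }

    public Task<TResult> GetAsync(Context context) => Task.FromResult(_inner.Get(context));

    public Task PutAsync(Context context, TResult value)
    {
        _inner.Put(context, value);
        return Task.CompletedTask;
    }
}
```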

Re:

if the cache doesn't support synchronous functionality, what should our default position be?

Again, really interested to hear your views. Thinking aloud from my side:

[3] The config overloads configuring Polly's sync cache policy should probably only take sync cache providers. IE just:

// Sync policy taking sync cache provider
CachePolicy<TResult> cachePolicy = Policy
  .Cache<TResult>(IResultCacheProvider<TResult> cacheProvider);

(The opposite – providing an overload for a sync cache policy taking an async provider – feels like creating a potentially confusing API. Particularly, there’s a risk people would mistake that syntax for giving them the benefits of async behaviour, when it’d not be: it’d have to be blocking on the calls to the async cache provider to bring it into a sync policy/sync call, no?)

[4] But we could allow the use of third-party caches with async-only interfaces, in Polly’s sync CachePolicys, if desired, by providing an implementation fulfilling Polly’s IResultCacheProvider<TResult> sync interface, just .Wait()-ing (or equiv) on the async calls. (And NB documenting that this is what it does!). Arguments for / against doing that? Doing it that way round, at least there’s no mistaking from the API we provide that you’re getting blocking/sync behaviour.
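The converse adapter described in [4] might look like the following sketch (all names are assumptions; it is deliberately written so the blocking behaviour is explicit):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative stand-in for Polly's execution Context (see #139).
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

public interface IResultCacheProviderAsync<TResult>
{
    Task<TResult> GetAsync(Context context);
    Task PutAsync(Context context, TResult value);
}

// Toy async provider, just for demonstrating the adapter.
public class InMemoryAsyncProvider<TResult> : IResultCacheProviderAsync<TResult>
{
    private readonly Dictionary<string, TResult> _store = new Dictionary<string, TResult>();
    public Task<TResult> GetAsync(Context context)
    {
        TResult value;
        _store.TryGetValue(context.ExecutionKey, out value);
        return Task.FromResult(value);
    }
    public Task PutAsync(Context context, TResult value)
    {
        _store[context.ExecutionKey] = value;
        return Task.CompletedTask;
    }
}

// Hypothetical blocking adapter: lets a sync CachePolicy consume an async-only
// provider. NB: it blocks the calling thread on every cache call, as discussed.
public class BlockingCacheProvider<TResult>
{
    private readonly IResultCacheProviderAsync<TResult> _inner;
    public BlockingCacheProvider(IResultCacheProviderAsync<TResult> inner) { _inner = inner; }

    public TResult Get(Context context) =>
        _inner.GetAsync(context).GetAwaiter().GetResult();

    public void Put(Context context, TResult value) =>
        _inner.PutAsync(context, value).GetAwaiter().GetResult();
}
```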

Hmm. The choice of C# clients for some of these caches has moved on since I was last involved, in some cases. Which of the 3rd-party caches are currently offering an async-only (no sync) API?

Thoughts on all this?

@SeanFarrow
Contributor

SeanFarrow commented Jul 18, 2016

Ok, thinking out loud:
In my mind the AsyncCachePolicy should return a Task<TResult> and Task for the get/put methods respectively.
Agree with 2, 3 and 4.
I haven’t checked specifics as yet, but generally anything cloud based will offer async and may offer sync, but they are moving towards the former only fairly rapidly.

Also, whilst I think about it, how do we want to handle the conversion from the cache to the TResult type?
Sometimes it may not be as straightforward as doing new T; should we offer the capability to define a delegate/lambda, or a conversion interface?

I’ve got a situation, for example, where I’m storing a base64-encoded compressed file (zip in this case), so I can’t just do new ZipArchive or the equivalent; it needs an extra processing step!
Also, this may be valid if you are storing the content of a web response as an array of bytes.
Thoughts…?

@reisenberger
Member Author

In my mind the AsyncCachePolicy should
return a Task<TResult> and Task for the get/put methods respectively.

Oops on my part: yes definitely!

(more on other q later)

@reisenberger
Member Author

Hey @SeanFarrow . Great to have all this on the cache policy!

Re:

how do we want to handle the conversion from the cache to the TResult type?
[...] should we offer the capability to define a delegate/lambda, or a conversion interface?

Where were you thinking this would sit in the architecture? As part of the CachePolicy configuration overloads, or in the IResultCacheProvider implementations?

My instinct is to keep the main CachePolicy configuration overloads simple-as-possible, ie we have the TimeSpan varieties plus:

.Cache<TResult>(IResultCacheProvider<TResult> cacheProvider) [and]
.CacheAsync<TResult>(IResultCacheProviderAsync<TResult> cacheProviderAsync)

rather than extend those with additional:

.Cache<TResult, TCachedFormat>(IResultCacheProvider<TResult> cacheProvider, ICacheValueFormatter<TResult, TCachedFormat> cacheValueFormatter) [etc]

(The formatter probably makes sense for some kinds of cache but not others. And: IResultCacheProvider<TResult> cacheProvider feels like the correct scope of interface to configure a CachePolicy ... for the policy to use the cache, all you need to know is that you can get and put in and out of it ... if some cache implementations prefer to compress or map to a more cloud-friendly format, that feels like a cache implementation concern. )

So thinking of it structurally as a cache implementation concern, my instinct is for the transform-for-caching functionality being part of IResultCacheProvider/Async implementations where needed, config'd on them where needed.

Sound sensible? / can you see disadvantages? / or just stating the obvious??

👍 re conversion interface. If we went as above ... and if there were a group of cache implementations (cloud caches?) where this approach might be particularly useful, one could still eg structure that with an abstract base class taking a conversion interface like you say, and some cache implementations deriving from that ...

Further thoughts? (You deep in and may see other angles! )

@SeanFarrow
Contributor

SeanFarrow commented Jul 18, 2016

I agree with you re scoping.
It may be that certain keys are compressed and others are serialized in different ways, so we may not be able to use a base class here. We could put an ICacheOutputConverter interface as part of the get/put calls, defaulting to null. If the converter is null we just use the default, which does a new T. That way it’s up to the user to decide/implement converters. We could provide some converters out of the box, such as serializing to/from JSON. If no converter is passed to put, we just use the cache’s native put call.

Finally, bear in mind that converting a value might not be straightforward: take the case where you have cached some compressed data; decompressing it might require more than just calling a class constructor, you may need to read from a memory stream, for example.
Thoughts…?
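As an illustration of the kind of converter under discussion, here is a hypothetical round-trip converter for the base64-encoded-gzip scenario described above. The interface shape and all names are assumptions for discussion, not an agreed API:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Hypothetical converter interface: maps between the result type and the
// representation actually stored in the cache.
public interface ICacheOutputConverter<TResult, TCached>
{
    TCached ToCacheFormat(TResult value);
    TResult FromCacheFormat(TCached cached);
}

// Example: store a byte[] payload as a base64-encoded gzip string, as in the
// compressed-file scenario above.
public class GzipBase64Converter : ICacheOutputConverter<byte[], string>
{
    public string ToCacheFormat(byte[] value)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(value, 0, value.Length);
            }
            return Convert.ToBase64String(output.ToArray());
        }
    }

    public byte[] FromCacheFormat(string cached)
    {
        using (var input = new MemoryStream(Convert.FromBase64String(cached)))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            gzip.CopyTo(output);
            return output.ToArray();
        }
    }
}
```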

@reisenberger
Member Author

reisenberger commented Jul 19, 2016

hey @SeanFarrow Great qs. Completely with you about needing conversion funcs rather than new-ing items out of cache. (Defined on an ICacheOutputConverter<TResult> interface or similar like you suggest sounds good!)

How do you see this:

we could put an ICacheOutputConverter interface as part of the get/put calls,
defaulting to null. If the converter is null we just use the default which does a new T.

looking in actual code? We'd need to avoid the various gotchas flowing from having optional parameters in interfaces (like the default values in the interface taking precedence over values in any implementations of the interface, if the call is being made against the interface not an implementation), but maybe that is not what you were thinking anyway?

@SeanFarrow
Contributor

SeanFarrow commented Jul 19, 2016

Hadn’t thought of that!
OK, how about having a SetCacheOutputConverter on the cache interface?

@reisenberger
Member Author

reisenberger commented Jul 19, 2016

how about having a SetCacheOutputConverter on the cache interface?

Mutable policies by property-injection/setter-method injection a possible trap for the unwary in highly concurrent / multi-threaded scenario? (Setting output converter then executing not atomic; risk some thread sets the cache output converter while another thread is mid executing?). (Might not be the way we envisage it being used, but opens up the possibility)

Constructor-injection somewhere (resulting in immutable policy) safer? ICacheOutputConverter<TResult> could perhaps be constructor-injected into the class fulfilling IResultCacheProvider/Async? What do you think?

@SeanFarrow
Contributor

SeanFarrow commented Jul 19, 2016

Possibly, yes, but what if I want a different converter per type?
We would need to support passing in an IEnumerable of converters.

@SeanFarrow
Contributor

SeanFarrow commented Dec 15, 2016 via email

@perfectsquircle

Hello,

I'm curious if you have a prediction of when this feature might land? It seems like there's been some promising work, but it's gone quiet recently. I have a strong interest in using the caching policy in combination with retry and circuit breaker for HTTP calls.

I'd also be happy to contribute if you need any help.

@reisenberger
Member Author

reisenberger commented May 1, 2017

@perfectsquircle Yes, among all the other features that got delivered at v5.0, this got left behind. I've wanted to take it forward, but it's been behind other things: contribution would be very welcome!

We have quite a developed architecture (thanks also to @SeanFarrow !), so the main thing we need now is some first cache provider implementations to plug into that. I've just re-based the architecture against latest Polly / stuff I'm about to release. Mini tour:

You construct a CachePolicy specifying:

  • ITtlStrategy: defines TTL for the items being cached by the CachePolicy. Various implementations already written.
  • ICacheKeyStrategy: defines what key to use to Get/Put in the cache. The default strategy is based on a value in the Context passed when .Execute(...)-ing on the policy. Users can write more elaborate strategies if they want (I will blog examples).
  • ICacheProvider: a simple Get/Put interface for any cache provider Polly could use. ICacheProviderAsync, similar interface for async.

So a typical cache might be configured something like:

Policy.Cache(
    myCacheProvider, 
    TimeSpan.FromHours(1) // or more specific ITtlStrategy
     /*, custom cache key strategy if desired */)

This test shows the basic usage.
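As an illustration of the ICacheKeyStrategy idea (the member names here are assumptions, not the final API), the default strategy would just return the Context's key, while a custom strategy might elaborate on it, e.g. by adding a namespace prefix:

```csharp
// Illustrative stand-in for Polly's execution Context, which carries an ExecutionKey.
public class Context
{
    public string ExecutionKey { get; }
    public Context(string executionKey) { ExecutionKey = executionKey; }
}

// Hypothetical shape of the strategy interface.
public interface ICacheKeyStrategy
{
    string GetCacheKey(Context context);
}

// Default-style strategy: key the cache directly on the Context's key.
public class DefaultKeyStrategy : ICacheKeyStrategy
{
    public string GetCacheKey(Context context) => context.ExecutionKey;
}

// Example custom strategy: namespace the key, e.g. per application or tenant.
public class PrefixedKeyStrategy : ICacheKeyStrategy
{
    private readonly string _prefix;
    public PrefixedKeyStrategy(string prefix) { _prefix = prefix; }
    public string GetCacheKey(Context context) => _prefix + ":" + context.ExecutionKey;
}
```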


So we need to implement some ICacheProviders. @perfectsquircle Are you interested in in-process/local caching? (eg MemoryCache, disk cache), or more cloud-caching (eg Redis) or ...? Any contribution in any of these would be welcome! Even just an initial ICacheProvider implementation based on System.Runtime.Caching.MemoryCache, would be enough to launch the feature.

  • ICacheProvider implementations will often depend on third-party libraries, and we didn't want the main Polly package to take those dependencies, so each ICacheProvider would be delivered as a separate Nuget, built out of a separate github repo.
  • There are skeleton repos which you (/anyone interested in contributing!) can fork for MemoryCache, disk, Redis, etc. We can make new repos for any other cache provider people want to support.
  • We'd need a build script for each of those repos to run tests and make the nuget package (I can help if needed/useful).

The architecture also envisages support for serializers like Protobuf etc: let me know if you have any interest in that, and we can discuss further. Otherwise let's leave for now.

I am very available for further help / guidance, if you want to work on this! Any of the above you'd be interested in tackling? (And: thank-you!)

@SeanFarrow
Contributor

SeanFarrow commented May 1, 2017 via email

@perfectsquircle

@reisenberger

Thank you for the comprehensive update. Maybe I'll get my feet wet and try to implement the memory or disk ICacheProvider. I suppose it would be sufficient to target .NET Standard 1.0 for these plugins?

@reisenberger
Member Author

reisenberger commented May 2, 2017

@perfectsquircle Great!

I made a start on a skeleton Visual Studio solution, build file, Nuget Packager etc for MemoryCache repo early this morning. I can probably push that to github in about an hour's time ...

I suppose it would be sufficient to target .NET Standard 1.0 for these plugins?

As low a .Net Standard version as we can get away with. It looks from package search as if lowest .NET Standard for MemoryCache might be .NET Standard 1.3. Fine if that's the case. Although the core Polly targets .NetStandard 1.0 (soon to change to .NetStandard 1.1 when we release #231), it shouldn't be a problem to make MemoryCache repo target .NET Standard 1.3 instead. The range of cache providers we're targeting will inevitably mean some have differing target support - delivering them through separate nuget pkgs will let us deal with that.

@reisenberger
Member Author

@perfectsquircle At https://github.com/App-vNext/Polly.Caching.MemoryCache, there is now a repo ready to fork and develop on.

TL;DR All we need to do now is start developing the Polly.Caching.MemoryCache.MemoryCacheProvider : Polly.Caching.ICacheProvider within the Polly.Caching.MemoryCache.Shared area of this repo, and specs in SharedSpecs.

I put in a dummy class and test only to test the build script (build.bat) was working: can be deleted.

The repo intentionally keeps the three-target layout (.NET4.0, .NET4.5 and .Net Standard) that Polly has, for now. Theoretically we could drop .NET4.5 as a separate target and have .NET4.5 consumers reference .Net Standard, but targeting NetStandard from NetFramework is very noisy until Microsoft (hopefully) fix this in .Net Standard 2.0.

For MemoryCache, you may have to change the .Net Standard 1.0 package to target .Net Standard 1.3, if package search was accurate. (I left it at .Net Standard 1.0, so that this commit could be a useful master for other cache providers).

Finally, to reference the interface Polly.Caching.ICacheProvider, you'd need to be able to reference a Polly nuget which includes it. Which obviously isn't public yet. So the procedure would be to clone
https://github.com/reisenberger/Polly/tree/v5.1.x-cache-rebase locally, run its build script, and reference the Polly nugets the build script places in the artifacts\nuget-package directory.

Phew - but that gets us a baseline to develop on!

Let me know if makes sense / whatever questions - whether around tooling or MemoryCacheProvider intent.

Huge thank you for your contribution!

@perfectsquircle

Hi @reisenberger,

I haven't gotten around to working on this. Things got crazy at work. I might try to take another crack at it soon.

@JoeBrockhaus

JoeBrockhaus commented May 26, 2017

Hi @reisenberger
Is there any chance you could setup a beta/alpha myget/vso feed based off the v5.1x-cache-rebase (if that's still the latest) branch?

@SeanFarrow
Contributor

SeanFarrow commented May 26, 2017 via email

@reisenberger
Member Author

reisenberger commented May 27, 2017

@joelhulen I have pulled the latest Cache rebase down onto this branch on App-vNext/Polly. Build from this branch will publish an appropriately tagged pre-release Polly build which you can push to nuget.

@JoeBrockhaus : @joelhulen plans to push the above to Nuget as a pre-release.

@JoeBrockhaus We would welcome contributions if you are able to contribute to Polly cache implementation - let us know what you would be interested in doing!

[ I can get back to CachePolicy myself likely in the second half of June. ]

@joelhulen
Member

joelhulen commented Jun 2, 2017

@reisenberger @JoeBrockhaus Sorry, the notification for this thread got lost amongst my piles of emails. Sometimes it's faster getting ahold of me on the Polly slack channel ;-)

I'll work toward releasing the pre-release NuGet and notify everyone here once it's up.

@joelhulen
Member

@reisenberger @JoeBrockhaus I've published those pre-release NuGet packages. Please let me know if you have any issues finding or using them.

@JoeBrockhaus

JoeBrockhaus commented Jun 8, 2017

@SeanFarrow Sorry for the super-delay on this feedback.

I was looking to incorporate a combination of Retry with a CircuitBreaker to proactively serve from Cache before failing on new requests whose dependencies would likely fail, but for which cached data would suffice.
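That combination can be illustrated with a hand-rolled toy (this is not Polly's PolicyWrap API; it just shows the intended ordering, with the cache consulted before any retried downstream call):

```csharp
using System;

// Toy illustration of the composition: consult the cache first, and only on a
// miss run the underlying call through a naive retry loop. In Polly this would
// be a PolicyWrap of CachePolicy, CircuitBreaker and Retry.
public static class CachedRetrySketch
{
    public static T Execute<T>(
        Func<string, T> cacheGet, Action<string, T> cachePut,
        string key, Func<T> action, int maxRetries) where T : class
    {
        T cached = cacheGet(key);
        if (cached != null) return cached;   // cache hit: no downstream call at all

        for (int attempt = 0; ; attempt++)
        {
            try
            {
                T result = action();
                cachePut(key, result);       // cache the successful result for later callers
                return result;
            }
            catch (Exception) when (attempt < maxRetries)
            {
                // swallow and retry; a circuit breaker policy would also sit in this layer
            }
        }
    }
}
```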

@SeanFarrow
Contributor

SeanFarrow commented Jun 8, 2017 via email

@JoeBrockhaus

JoeBrockhaus commented Jun 8, 2017

Would likely be async, though I'm not sure if it would be a blocker either way.
I have had to move onto other priorities in the meantime, unfortunately.
I'll try to find some time to poke it in the next couple days. 😀

@SeanFarrow
Contributor

SeanFarrow commented Jun 8, 2017 via email

@SeanFarrow
Contributor

All,

I've just been looking at the memory cache, we can't provide an async api, as one does not exist. Does anyone see a problem with this?

@reisenberger
Member Author

Hi @SeanFarrow . Re:

I've just been looking at the memory cache, we can't provide an async api, as one does not exist. Does anyone see a problem with this?

I don't think this is a significant problem. We can simply write an implementation for CacheAsync(...) that addresses a sync cache provider instead of an async one, at this line (and similar). It may mean a few extra configuration overloads, with the compiler selecting the right overload. We can add this when we next visit the cache architecture.

@dweggemans

Is there an ETA on the caching feature?

@reisenberger
Member Author

@dweggemans The caching feature is expected to be released in September.

This branch https://github.com/App-vNext/Polly/tree/v5.3.x-cachebeta contains the latest caching architecture, ie the core classes within Polly to support CachePolicy. The build script will generate locally a nuget package for same.

This repo https://github.com/App-vNext/Polly.Caching.MemoryCache contains a beta-release of an ISyncCacheProvider and IAsyncCacheProvider implementation for MemoryCache. The build script will generate locally a beta nuget package for same. /cc @SeanFarrow

@dweggemans : Are there particular cache providers you are looking to support? Community contributions to support new cache providers will be welcome: The required interfaces to implement (ISyncCacheProvider and/or IAsyncCacheProvider) are relatively straightforward.

Polly contributors, eg @SeanFarrow , also already have a range of distributed cache providers in mind.

@dweggemans

@reisenberger thanks for your response. I might be able to wait a little, or else I'll build a package locally. No problem.

The MemoryCache suits my needs perfectly. I'm just looking for a simple way to reduce some traffic by caching results locally.

@reisenberger reisenberger modified the milestones: 5.0.0, v5.4.0 Oct 22, 2017
@reisenberger
Member Author

Closing via #332

CachePolicy has been merged into the master branch, for release shortly as part of Polly v5.4.0.

The first cache provider implementation to go with CachePolicy - based around .NET's in-built MemoryCache - is available at: https://github.com/App-vNext/Polly.Caching.MemoryCache.

The two will be released together to nuget, as soon as we hook up the build and nuget feed onto https://github.com/App-vNext/Polly.Caching.MemoryCache. /cc @joelhulen

Doco at: https://github.com/App-vNext/Polly/wiki/Cache
