
ConfigurationRoot holding too many ConfigurationReloadToken - Possible memory leak #2576

Closed
nkpun opened this issue Oct 30, 2019 · 5 comments

nkpun commented Oct 30, 2019

We have built a microservice using .NET Core 2.2. During a load test we noticed significant memory growth. The dump is shown below:

[screenshot: memory dump]

Memory dump analysis:
It seems System.Threading.CancellationTokenSource+CallbackNode was being created a very large number of times, and each instance holds on to a System.Threading.IAsyncLocal[] object.
All of these can be traced back to Microsoft.Extensions.Configuration.ConfigurationReloadToken (a minimal sketch of the mechanism follows the !gcroot output below).

0:000> !gcroot 53184fec
Thread 31f0:
    0057E9F4 6A3511DA System.Threading.Tasks.Task.SpinThenBlockingWait(Int32, System.Threading.CancellationToken) [E:\A\_work\367\s\src\mscorlib\src\System\Threading\Tasks\Task.cs @ 2959]
        ebp+28: 0057e9fc
            ->  03313554 System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1+AsyncStateMachineBox`1[[System.Threading.Tasks.VoidTaskResult, System.Private.CoreLib],[Microsoft.AspNetCore.Hosting.WebHostExtensions+<RunAsync>d__4, Microsoft.AspNetCore.Hosting]]
            . . .
            . . .
            . . .
            ->  0322D2EC System.Func`2[[Microsoft.ApplicationInsights.Extensibility.ITelemetryProcessor, Microsoft.ApplicationInsights],[Microsoft.ApplicationInsights.Extensibility.ITelemetryProcessor, Microsoft.ApplicationInsights]][]
            ->  0322C920 System.Func`2[[Microsoft.ApplicationInsights.Extensibility.ITelemetryProcessor, Microsoft.ApplicationInsights],[Microsoft.ApplicationInsights.Extensibility.ITelemetryProcessor, Microsoft.ApplicationInsights]]
            ->  0322861C Microsoft.DiagnosticServices.SnapshotCollector.HostingStartup.Startup+SnapshotCollectorTelemetryProcessorFactory
            ->  031FE110 Microsoft.Extensions.DependencyInjection.ServiceLookup.ServiceProviderEngineScope
            ->  032055E8 System.Collections.Generic.List`1[[System.IDisposable, System.Private.CoreLib]]
            ->  03540334 System.IDisposable[]
            ->  03204660 Microsoft.Extensions.Logging.Console.ConsoleLoggerProvider
            . . .
            . . .
            . . .
            ->  0320407C Microsoft.Extensions.Configuration.ConfigurationRoot
            ->  03204018 System.Collections.Generic.List`1[[Microsoft.Extensions.Configuration.IConfigurationProvider, Microsoft.Extensions.Configuration.Abstractions]]
            ->  0320403C Microsoft.Extensions.Configuration.IConfigurationProvider[]
            ->  03204030 Microsoft.Extensions.Configuration.ChainedConfigurationProvider
            ->  03203ED0 Microsoft.Extensions.Configuration.ConfigurationSection
            ->  0312F89C Microsoft.Extensions.Configuration.ConfigurationRoot
            ->  0312F8AC Microsoft.Extensions.Configuration.ConfigurationReloadToken
           ->  0312F8B8 System.Threading.CancellationTokenSource
            ->  031EB964 System.Threading.CancellationTokenSource+CallbackPartition[]
            ->  031EB974 System.Threading.CancellationTokenSource+CallbackPartition
            ->  782E56C0 System.Threading.CancellationTokenSource+CallbackNode
           . . .
            . . .
            . . .
            ->  531875D0 System.Threading.CancellationTokenSource+CallbackNode
            ->  531874DC System.Threading.ExecutionContext
            ->  53184FEC System.Threading.IAsyncLocal[]
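
To make the shape of the leak concrete, here is a minimal sketch (an assumed repro, not our actual service code) of how these callback nodes accumulate: every call to GetReloadToken().RegisterChangeCallback adds a node to the token's underlying CancellationTokenSource, and that node is only released when the returned IDisposable is disposed or the token fires.

```csharp
using Microsoft.Extensions.Configuration;

// Assumed repro sketch, not the reporter's actual code.
var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
    .Build();

for (int i = 0; i < 100_000; i++)
{
    // Each registration adds a CallbackNode (capturing the current
    // ExecutionContext and its IAsyncLocal[] state) to the reload token's
    // CancellationTokenSource. Nothing disposes the registration here, so the
    // nodes accumulate until the token actually fires on a reload.
    config.GetReloadToken().RegisterChangeCallback(_ => { /* react to reload */ }, state: null);
}
```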

I found a similar bug reported here: #861

However, this bug seems to have only been fixed in .NET Core 3.0.0.

Given that we are facing this issue on .NET Core 2.2, can you please advise:

  1. Has this fix been backported to .NET Core 2.2 or earlier?
  2. If not, how do we apply this fix on .NET Core 2.2 without having to upgrade to 3.0.0?

We also tried disabling configuration reload (roughly as sketched below), but it didn't help.
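
For context, "disabling reload" here means roughly the following (a simplified sketch, not our exact builder setup):

```csharp
using Microsoft.Extensions.Configuration;

// Simplified sketch of disabling reload (assumption: the JSON file provider
// was the only source with reloadOnChange enabled).
var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: false)
    .AddEnvironmentVariables()
    .Build();

// Even with reloadOnChange: false, ConfigurationRoot still hands out reload
// tokens via GetReloadToken(), so code that keeps registering callbacks on
// them can still accumulate CallbackNode instances.
```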

@Tratcher (Member)

@davidfowl Does this match your investigation from #861?

@davidfowl (Member)

@nkpun Can you reproduce this? Are you reloading configuration at runtime? Can you email the details to david.fowler at microsoft.com (we can also figure out how to get me the memory dump)?

nkpun (Author) commented Oct 31, 2019

Thanks @davidfowl, I've sent an email with all the details. Let me know if you don't receive it within a few minutes.

nkpun (Author) commented Nov 1, 2019

Thanks heaps, @davidfowl, for helping us identify the root cause of the memory leak.

For others' benefit, see the screenshot below for the cause of the steady memory growth: Serilog was registered as Transient instead of Singleton, so it kept creating new reload-token registrations. A rough sketch of the before/after wiring follows the screenshot.

[screenshot: cause of the steady memory growth]
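
A hypothetical before/after of the registration, based on the description above (the actual service wiring differs, and ReadFrom.Configuration is assumed to be what ties the logger to the configuration's reload token):

```csharp
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Serilog;

public void ConfigureServices(IServiceCollection services)
{
    // Before (leaky, hypothetical): a new logger is built on every resolution,
    // and each one hooks into the configuration's reload token, so the
    // registrations pile up for the lifetime of the ConfigurationRoot.
    services.AddTransient<Serilog.ILogger>(sp =>
        new LoggerConfiguration()
            .ReadFrom.Configuration(sp.GetRequiredService<IConfiguration>())
            .CreateLogger());

    // After (fix): register the logger once as a singleton, so the reload
    // token is only touched a single time for the application's lifetime.
    services.AddSingleton<Serilog.ILogger>(sp =>
        new LoggerConfiguration()
            .ReadFrom.Configuration(sp.GetRequiredService<IConfiguration>())
            .CreateLogger());
}
```

Only one of the two registrations would exist in practice; both are shown here for contrast.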

analogrelay added this to the Backlog milestone Jan 23, 2020
ghost added the Status: Stale label May 8, 2020
ghost commented May 8, 2020

As part of the migration of components from dotnet/extensions to dotnet/runtime (aspnet/Announcements#411), we will be bulk closing some of the older issues. If you are still interested in having this issue addressed, just comment and the issue will be automatically reactivated (even if you aren't the author). When you do that, I'll page the team to come take a look. If you've moved on or worked around the issue and no longer need this change, just ignore this and the issue will be closed in 7 days.

If you know that the issue affects a package that has moved to a different repo, please consider re-opening the issue in that repo. If you're unsure, that's OK, someone from the team can help!

ghost closed this as completed May 15, 2020
ghost locked as resolved and limited conversation to collaborators May 26, 2023