
Memory Leak Introduced in 1.7.2 #455

Closed
BenRKarl opened this issue Mar 9, 2020 · 4 comments
Assignees
busunkim96

Labels
priority: p1 - Important issue which blocks shipping the next release. Will be fixed prior to next release.
🚨 This issue needs some love.
type: bug - Error or flaw in code with unintended results or allowing sub-optimal usage patterns.

Comments

@BenRKarl

BenRKarl commented Mar 9, 2020

Bug discovered via google-ads-python; original issue here.

Environment details

  • OS: Ubuntu 18.04 and also my gLinux desktop.
  • Python version: 3.7.0
  • pip version: 10.0.1
  • google-auth version: 1.7.2+

Steps to reproduce

Not exactly sure, since this is exposed in google-ads-python via requests that require user credentials. However, the code in this change appears to be the cause. Specifically, closing threads in the __del__ method of AuthMetadataPlugin may be causing problems, because there are no guarantees about when __del__ is called, and in our case it seems it may not be called at all.
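For context, the pattern being described looks roughly like this (a simplified sketch with hypothetical internals, not the library's exact code): each plugin instance owns its own executor and relies on __del__ to shut it down.

# Simplified sketch of the pattern described above; the per-instance
# pool and shutdown call are illustrative, not the library's exact code.
from concurrent import futures

class AuthMetadataPlugin:
    def __init__(self):
        self._pool = futures.ThreadPoolExecutor(max_workers=1)

    def __del__(self):
        # CPython makes no guarantee about when (or whether) __del__ runs;
        # reference cycles or interpreter shutdown can skip it entirely,
        # so these worker threads may never be reclaimed.
        self._pool.shutdown(wait=False)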

busunkim96 self-assigned this Mar 9, 2020
busunkim96 added the priority: p1 and type: bug labels Mar 9, 2020
@busunkim96
Contributor

@lidizheng Could you provide guidance on the best way to implement gRPC's AuthMetadataPlugin.__call__? I think this library's implementation (using thread pools) was modeled closely on grpc/_auth.py.

@lidizheng
Contributor

Okay. This is a bit awkward.

The code has been out for 3 years, so I guess we need to improve its design. I might come back with a PR or a CL? Which one is preferred?

@busunkim96
Contributor

Thanks for the quick response! A PR would be preferred. I'm also happy to tweak this myself if you can point me in the right direction.

To be completely honest, I have not worked with threading/threadpool, so I was wondering if there was some obvious way to correct this that I was missing. 😅

@lidizheng
Contributor

lidizheng commented Mar 10, 2020

I have a workaround: instead of creating a large number of ThreadPoolExecutors, we can share a single global one. Like:

from concurrent import futures

_global_thread_pool = None

class AuthMetadataPlugin(...):
    def __init__(self, ...):
        global _global_thread_pool
        ...
        if _global_thread_pool is None:
            # Let the system default decide max_workers
            _global_thread_pool = futures.ThreadPoolExecutor()
        self._pool = _global_thread_pool

It solves the spiking number of threads, but not the memory leak issue entirely.

ThreadPoolExecutor is slow and buggy. The threads it spawns are not daemon threads, but threads with special tags that get cleaned up by a shutdown handler registered on interpreter exit. This has caused random hangs on exit in gRPC Python.
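To illustrate the shutdown-handler behavior (my own sketch, not code from this thread; exact mechanics vary by CPython version): a task that never finishes can stall interpreter exit, because concurrent.futures joins outstanding worker threads in an exit handler.

import time
from concurrent import futures

pool = futures.ThreadPoolExecutor(max_workers=1)
pool.submit(time.sleep, 3600)  # a long-running task

# When the script reaches the end, the exit handler registered by
# concurrent.futures waits for the worker thread to finish, so the
# process hangs for up to an hour instead of exiting promptly.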


EDIT: Using a class static variable is better.

from concurrent import futures

class AuthMetadataPlugin(...):

    _AUTH_THREAD_POOL = futures.ThreadPoolExecutor()

    def __call__(self, context, callback):
        ...
        future = self._AUTH_THREAD_POOL.submit(self._get_authorization_headers, context)
        ...
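A quick illustration of the design choice (my own note, not from the thread): because the executor is a class attribute, it is created once when the class body executes and is shared by every instance, which caps thread growth no matter how many plugins are constructed.

from concurrent import futures

class Plugin:
    # Created once, at class-definition time; shared by all instances.
    _POOL = futures.ThreadPoolExecutor()

a = Plugin()
b = Plugin()
assert a._POOL is b._POOL  # both instances submit to the same executor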

yoshi-automation added and then removed the 🚨 This issue needs some love. label Mar 15, 2020
busunkim96 added a commit that referenced this issue Mar 16, 2020
This partially addresses #455 by reducing the number of threads created.
yoshi-automation added the 🚨 This issue needs some love. label Mar 16, 2020