Memory leak when doing https request #133
Comments
Sorry, I cannot reproduce the problem.
Here is the code:

```python
import asyncio
import aiohttp
import sys

if __name__ == '__main__':
    @asyncio.coroutine
    def fetch(url):
        try:
            response = yield from aiohttp.request('GET', url,
                                                  allow_redirects=True,
                                                  max_redirects=10)
            if response:
                print("Have response for %s" % response)
        except:
            print('Exception')
        # Re-schedule this coroutine so the fetches run forever.
        asyncio.async(fetch(url))

    concurrentFetches = int(sys.argv[1])
    url = sys.argv[2]
    for i in range(concurrentFetches):
        asyncio.async(fetch(url))
    asyncio.get_event_loop().run_forever()
```

I run it in a CentOS 6 Docker container with memory limited to 2 GB. Here are the arguments:

```
python memLeakTest.py 100 https://www.paypal.com/
```

It works for about 3-6 minutes: memory usage climbs to 2 GB very quickly, and a few minutes later the container crashes. When I run it against a non-HTTPS site, memory stays between 50 and 200 MB, depending on the site. Please tell me if there is anything more I can do to help.
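For what it's worth, a common cause of growth like this is responses that are never released, which keeps their comparatively large SSL transports alive. Below is a minimal sketch of the same stress test rewritten for current aiohttp and Python (3.7+); this is not the reporter's original code, just one way to rule out unreleased responses as the culprit:

```python
import asyncio
import sys

import aiohttp


async def fetch(session, url):
    try:
        # 'async with' guarantees the response body is consumed and the
        # connection (including its SSL transport) is returned to the
        # pool, even when an exception is raised mid-request.
        async with session.get(url, allow_redirects=True,
                               max_redirects=10) as response:
            await response.read()
            print("Have response for %s" % response.url)
    except aiohttp.ClientError as exc:
        print("Exception: %r" % exc)


async def main(concurrency, url):
    # Bound the number of simultaneous connections so transports
    # cannot pile up faster than they are released.
    connector = aiohttp.TCPConnector(limit=concurrency)
    async with aiohttp.ClientSession(connector=connector) as session:
        while True:
            await asyncio.gather(*(fetch(session, url)
                                   for _ in range(concurrency)))


if __name__ == '__main__':
    asyncio.run(main(int(sys.argv[1]), sys.argv[2]))
```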
@yanivhdd

You are right,
I experienced memory leak issues and narrowed them down to HTTPS requests, for example:

```python
response = yield from aiohttp.request('GET', 'https://www.similartech.com',
                                      allow_redirects=True,
                                      max_redirects=10)
```

or

```python
response = yield from aiohttp.request('GET', 'https://www.paypal.com',
                                      allow_redirects=True,
                                      max_redirects=10)
```

I'm doing 1000 concurrent fetches in an infinite loop and see the memory go up to 2 GB in less than one minute. When I do the same loop with an http-scheme URL, I don't see the memory go up at all.
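To narrow a leak like this down further, the standard-library tracemalloc module can attribute live allocations to source lines. A minimal sketch, with the HTTPS fetch loop stood in for by a placeholder allocation:

```python
import tracemalloc

tracemalloc.start(25)  # record up to 25 stack frames per allocation

# Placeholder for one batch of HTTPS fetches; replace with the real loop.
payload = [b"x" * 1024 for _ in range(10_000)]

# Print the ten source lines holding the most live memory.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:10]:
    print(stat)
```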