Memory leak when doing https request #133

Closed
ghost opened this issue Aug 2, 2014 · 4 comments

Comments

@ghost

ghost commented Aug 2, 2014

I experienced memory leak issues and narrowed them down to https requests, for example:

response = yield from aiohttp.request('GET', 'https://www.similartech.com',
allow_redirects=True,
max_redirects=10)
or

response = yield from aiohttp.request('GET', 'https://www.paypal.com',
allow_redirects=True,
max_redirects=10)

I"m doing 1000 concurrent fetches in an infinite loop and see the memory goes up to 2gig in less then 1min.

When I run the same loop against a URL with the http scheme, I don't see the memory go up at all.

@asvetlov
Member

asvetlov commented Aug 2, 2014

Sorry, I cannot reproduce the problem.
Can you share the full test code?

@ghost
Author

ghost commented Aug 3, 2014

Here is the code:

import asyncio
import aiohttp
import sys

if __name__ == '__main__':

    @asyncio.coroutine
    def fetch(url):
        try:
            response = yield from aiohttp.request('GET', url, allow_redirects=True, max_redirects=10)

            if response:
                print("Have response for %s" % response)
        except:
            print('Exception')

        asyncio.async(fetch(url))

    concurrentFetches = int(sys.argv[1])
    url = sys.argv[2]

    for i in range(concurrentFetches):
        asyncio.async(fetch(url))

    asyncio.get_event_loop().run_forever()

I run it in a CentOS 6 Docker container with memory limited to 2 GB.

Here is how I invoke it:

python memLeakTest.py 100 https://www.paypal.com/

It runs for about 3-6 minutes; memory usage reaches 2 GB very quickly, and after a few minutes the container crashes.

When I run it against a non-https site, memory stays between 50-200 MB, depending on the site.

Please tell me if there is anything more I can do to help.

@kxepal
Member

kxepal commented Aug 3, 2014

@yanivhdd
The yield from aiohttp.request(...) call returns a response object but does not read the content. You should call yield from resp.read() or yield from resp.release() to read the response data into a variable or discard it, respectively. Otherwise the response data accumulates in internal buffers, and since you never read it, this causes the memory leak.
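
For reference, here is a minimal sketch of the fetch coroutine adjusted along those lines, assuming the same Python 3.4-era asyncio/aiohttp API used in the reproduction script above:

import asyncio
import aiohttp

@asyncio.coroutine
def fetch(url):
    # Same request as in the reproduction script.
    response = yield from aiohttp.request('GET', url,
                                          allow_redirects=True,
                                          max_redirects=10)
    # Drain the body so aiohttp's internal buffers are freed.
    body = yield from response.read()
    print("Got %d bytes from %s" % (len(body), url))
    # If the body is not needed, `yield from response.release()`
    # discards it instead of reading it into memory.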

@ghost
Author

ghost commented Aug 3, 2014

You are right,
sorry for wasting your time :)

This issue was closed.