
ValueError when server event is bigger than 128kbytes #11

Closed
spacemanspiff2007 opened this issue Mar 29, 2020 · 5 comments

@spacemanspiff2007

  • Python version: 3.7
  • Operating System: Win7 + Ubuntu
  • aiohttp version: newest

Description

When the server sends an event larger than ~128 KiB, aiohttp raises a ValueError because the line buffer is not big enough.

What I Did

Received an event larger than ~128 KiB.

client.py

# async for ... in StreamReader only splits lines by \n
while self._response.status != 204:
    async for line_in_bytes in self._response.content:  # <-- this causes the error
        line = line_in_bytes.decode('utf8')

Existing Issues
spacemanspiff2007/HABApp#118
aio-libs/aiohttp#4453

Suggested Workaround:
https://github.com/zalando-incubator/kopf/pull/276/files

@spacemanspiff2007
Author

@awarecan could you please take a look at this issue? I've attached a simple workaround. Do you think it can be used?

@awarecan
Contributor

I'd rather not rewrite the aiohttp stream function just for a corner case. Why isn't the monkey patch working?

@spacemanspiff2007
Author

Thank you for your reply.
I tried to change the max size like this, but it doesn't work.

import aiohttp
# try to raise the module-level default before the SSE client imports it
aiohttp.streams.DEFAULT_LIMIT = 10 * aiohttp.streams.DEFAULT_LIMIT
from aiohttp.client import ClientResponse
from aiohttp_sse_client import client as sse_client

I am not sure how to properly monkey patch the .content property.
Could you give me a hint? Thanks a lot.

@alinagol

alinagol commented Apr 14, 2020

Hi @awarecan. I would disagree that it is a corner case.

I'm having a similar issue while reading JSON streams.
Since they are separated by \n\n, I can't use the iter_chunks function, which expects a \r\n separator. So I iterated through response.content and hit the same error as @spacemanspiff2007.

I also tried to monkey patch in the same way, but it doesn't seem to have any effect. The only solution I found is:

s = response.content
s._high_water = 2 ** 18

That doesn't look very neat, and it would be great to have a parameter to choose the iterator or change the limit.
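
For illustration, here is a minimal sketch of that workaround applied to a plain aiohttp request (the stream URL and the handle_line() helper are placeholders; _high_water is a private StreamReader attribute, so this may break between aiohttp releases):

import asyncio
import aiohttp

def handle_line(line):
    print(line)  # hypothetical handler

async def read_stream():
    async with aiohttp.ClientSession() as session:
        async with session.get('http://example.com/stream') as response:  # placeholder URL
            # Bump the private high-water mark so lines up to ~256 KiB fit in the buffer.
            response.content._high_water = 2 ** 18
            async for line_in_bytes in response.content:
                handle_line(line_in_bytes.decode('utf8'))

asyncio.run(read_stream())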

@awarecan
Contributor

awarecan commented Oct 20, 2020

It has been resolved by aio-libs/aiohttp#5065

A read_bufsize parameter can be passed to the EventSource constructor since aiohttp v3.7:

EventSource(url, read_bufsize=2**18)
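
For example, a minimal usage sketch (the event URL is a placeholder; read_bufsize is forwarded to the underlying aiohttp request and requires aiohttp >= 3.7):

import asyncio
from aiohttp_sse_client import client as sse_client

async def main():
    # read_bufsize raises aiohttp's read buffer above the default 2**16 bytes.
    async with sse_client.EventSource('http://example.com/events',  # placeholder URL
                                      read_bufsize=2 ** 18) as event_source:
        async for event in event_source:
            print(event.data)

asyncio.run(main())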
