
asyncio.exceptions.LimitOverrunError #56

Closed
mohnishkodnani opened this issue Nov 9, 2021 · 7 comments

@mohnishkodnani

  • Yet Another Cron version: 0.14.0
  • Python version: 3.9
  • Operating System: python:3.9-slim-buster Docker image.

Description

When the job sends multiple requests.get API calls, one call suddenly errors out (the same call every time) with the following message.

[touchstone stderr] < Host: abc.com:8080
[touchstone stderr] < User-Agent: python-requests/2.26.0
[touchstone stderr] < Accept-Encoding: gzip, deflate
[touchstone stderr] < Accept: */*
[touchstone stderr] < Connection: keep-alive
[touchstone stderr] < 
[touchstone stderr] 
[touchstone stderr] > HTTP/1.1 200 OK
[touchstone stderr] > Content-Type: application/json
[touchstone stderr] > content-encoding: gzip
[touchstone stderr] > transfer-encoding: chunked
[touchstone stderr] > 
ERROR:yacron:please report this as a bug (2)
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/asyncio/streams.py", line 540, in readline
    line = await self.readuntil(sep)
  File "/usr/local/lib/python3.9/asyncio/streams.py", line 635, in readuntil
    raise exceptions.LimitOverrunError(
asyncio.exceptions.LimitOverrunError: Separator is found, but chunk is longer than limit

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/yacron/cron.py", line 356, in _wait_for_running_jobs
    task.result()
  File "/usr/local/lib/python3.9/site-packages/yacron/job.py", line 420, in wait
    await self._read_job_streams()
  File "/usr/local/lib/python3.9/site-packages/yacron/job.py", line 427, in _read_job_streams
    ) = await self._stderr_reader.join()
  File "/usr/local/lib/python3.9/site-packages/yacron/job.py", line 76, in join
    await self._reader
  File "/usr/local/lib/python3.9/site-packages/yacron/job.py", line 54, in _read
    line = (await stream.readline()).decode("utf-8")
  File "/usr/local/lib/python3.9/asyncio/streams.py", line 549, in readline
    raise ValueError(e.args[0])
ValueError: Separator is found, but chunk is longer than limit

When I run the same job outside of yacron, it works just fine.


@gjcarneiro
Owner

From what I can gather, this is probably because the subprocess that yacron runs is producing a single line of output longer than 64 KiB (65,536 bytes), which is asyncio's default per-line read limit.

I'll have to think about the best way to handle it. In the meantime, you might want to avoid having this process generate such long lines. Definitely a bug, though, thanks for the report!
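
For reference, here is a minimal sketch (not yacron's code) that reproduces the failure using only asyncio's standard subprocess streams; it assumes nothing about yacron beyond the readline() call visible in the traceback above:

```python
# Hypothetical reproduction, independent of yacron: StreamReader.readline()
# converts LimitOverrunError into ValueError("Separator is found, but chunk
# is longer than limit") once a single line exceeds the default 2**16-byte limit.
import asyncio
import sys

async def main():
    # Child process that prints a single ~100 KB line, well past 64 KiB.
    proc = await asyncio.create_subprocess_exec(
        sys.executable, "-c", "print('x' * 100_000)",
        stdout=asyncio.subprocess.PIPE,
    )
    try:
        await proc.stdout.readline()
    except ValueError as exc:
        print("readline failed:", exc)
    await proc.wait()

asyncio.run(main())
```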

@mohnishkodnani
Author

Hi,
I saw the fix, thanks a lot. One question: instead of ignoring the line, would it be possible to read the whole chunk by specifying a bigger limit? In our case the response contains some very large JSON.

@gjcarneiro
Owner

But what are you doing with that output? If you send it to Sentry, Sentry will ignore it for the same reason: it's too big. If you send it as email, I'm not sure, but a message that large might run into similar problems.

If you just want to log it to the container's stdout, you can specify captureStdout: false and yacron will just let it pass through to stdout unchanged.

While it's technically possible to change the line limit, I'm not quite convinced it is really that important...

@mohnishkodnani
Author

mohnishkodnani commented Nov 10, 2021 via email

@gjcarneiro
Owner

Parse that output from where? You ask yacron to send it as email, and then you parse the output out of the email?

OK, fine, if you insist, I'll reopen this to add a config option for the limit. It seems like a convoluted use case, but it's not that hard to make the limit configurable.
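
I don't know yet exactly what shape the option will take, but the knob it would most likely map onto is the `limit` argument of asyncio's subprocess helpers; a rough sketch under that assumption (not the actual yacron change):

```python
# Sketch only, assuming the fix raises asyncio's per-line limit: the
# subprocess helpers accept `limit=`, which sizes the StreamReader so that
# readline() can return lines much longer than the default 64 KiB.
import asyncio
import sys

async def main():
    proc = await asyncio.create_subprocess_exec(
        sys.executable, "-c", "print('x' * 1_000_000)",
        stdout=asyncio.subprocess.PIPE,
        limit=4 * 1024 * 1024,  # 4 MiB per line instead of the 64 KiB default
    )
    line = await proc.stdout.readline()  # now succeeds for a ~1 MB line
    print(len(line))
    await proc.wait()

asyncio.run(main())
```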

gjcarneiro reopened this Nov 10, 2021
@mohnishkodnani
Author

yacron runs a job (the job is also a Python script); the job queries a service, fetches the JSON, parses it, and puts it in a DB.
If I run the job outside yacron it works fine, but when it is invoked via yacron, I get this error.

@gjcarneiro
Owner

Sure, but then it sounds like printing the JSON to stdout is just for debugging, in which case ignoring the very long line should have no effect on the script's ability to do its job of parsing the JSON and storing it in the DB.

My claim is that ignoring this line is probably fine. I'll add the config option when I have more time, but is it possible that you are assuming the job fails when in fact it succeeds?
