fix(cli): HTTP streamed task output watching
When watching streamed task output on the Fatbuildrweb HTTP REST API,
fatbuildrctl crashed because the iterator over the response content
stopped unexpectedly. The python requests Response.iter_content()
generator was not used properly, and a recent change made for better
urllib3 integration turned this misuse into a regression: the chunk
size of the generator cannot be changed dynamically while reading data.
The data is now placed into a buffer, and the buffer is refilled with
the next chunks when new data is required.

fix #138
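
The buffering approach described above can be sketched outside Fatbuildr. This is an illustrative example only, not code from the commit: a plain list of byte chunks stands in for the Response.iter_content() generator, and reader() serves exact-size reads by keeping leftover bytes in a buffer.

def make_reader(chunks):
    # Return a reader(size) callable that serves exact-size reads from an
    # iterator of arbitrarily sized byte chunks, buffering leftover bytes.
    iterator = iter(chunks)
    buffer = next(iterator)

    def reader(size):
        nonlocal buffer
        data = bytes()
        while size:
            if len(buffer) >= size:
                # Enough bytes are buffered: consume them from the head.
                data += buffer[:size]
                buffer = buffer[size:]
                size = 0
            else:
                # Drain the buffer and pull the next chunk from the iterator.
                data += buffer
                size -= len(buffer)
                buffer = next(iterator)
        return data

    return reader

reader = make_reader([b'abc', b'defgh', b'ij'])
assert reader(2) == b'ab'
assert reader(5) == b'cdefg'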
rezib committed Aug 28, 2023
1 parent 71e636d commit 62a3ff8
Showing 2 changed files with 22 additions and 2 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -68,6 +68,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
internal error and HTTP/500 (#136).
- Report meaningful error message instead of generic HTTP/500 internal error
when authenticating with JWT token on unexisting remote HTTP instance (#137).
- Fix crash when watching streamed task output with HTTP REST API (#138).

## [2.0.0] - 2023-05-05

23 changes: 21 additions & 2 deletions fatbuildr/console/client.py
@@ -251,7 +251,7 @@ def console_reader(io, binary):
    on the socket. If binary argument is False, ConsoleMessage are generated as
    objects. Otherwise they are generated in bytes.
    This function is supposed to be called for archives tasks only."""
    This function is supposed to be called for archived tasks only."""
    with open(io.journal.path, 'rb') as fh:
        yield from _console_generator(binary, fd=fh.fileno())

@@ -260,8 +260,27 @@ def console_http_client(response):
    """Reads and generates the ConsoleMessage available in the given HTTP
    response object."""

    iterator = response.iter_content(chunk_size=32)
    buffer = next(iterator)

    def reader(size):
        return next(response.iter_content(chunk_size=size))
        # The chunk size of the response content generator cannot be changed
        # while reading streamed data. The data are placed into a buffer.
        # Depending on the required read size, the data is extracted from the
        # beginning of the buffer or read from the generator until the expected
        # size is reached.
        nonlocal buffer
        chunk = bytes()
        while size:
            if len(buffer) >= size:
                chunk += buffer[:size]
                buffer = buffer[size:]
                size = 0
            else:
                chunk += buffer
                size -= len(buffer)
                buffer = next(iterator)
        return chunk

    yield from _console_generator(False, reader=reader)

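As a hedged usage sketch (the watch_task_output() helper and the printing of messages are illustrative assumptions, not part of the commit), console_http_client() is meant to be fed a streamed requests response so that chunks arrive incrementally while the task runs:

import requests

from fatbuildr.console.client import console_http_client

def watch_task_output(url):
    # url is a placeholder for a Fatbuildrweb task output endpoint.
    response = requests.get(url, stream=True)
    # console_http_client() yields ConsoleMessage objects as the server
    # streams the task output.
    for msg in console_http_client(response):
        print(msg)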
