Stream response body in ASGITransport #3059
base: master
Conversation
tests/test_asgi.py
Outdated
@pytest.mark.anyio
async def test_asgi_stream_returns_before_waiting_for_body():
    start_response_body = anyio.Event()

    async def send_response_body_after_event(scope, receive, send):
        status = 200
        headers = [(b"content-type", b"text/plain")]
        await send(
            {"type": "http.response.start", "status": status, "headers": headers}
        )
        await start_response_body.wait()
        await send({"type": "http.response.body", "body": b"body", "more_body": False})

    async with httpx.AsyncClient(app=send_response_body_after_event) as client:
        async with client.stream("GET", "http://www.example.org/") as response:
            assert response.status_code == 200
            start_response_body.set()
            await response.aread()
            assert response.text == "body"
This test makes sense to me yep.
How about the case where we're streaming the response body, and ensuring that we're able to receive it incrementally? Are we able to test that also?
await start_response_body.wait()
await send({"type": "http.response.body", "body": b"1", "more_body": True})
await keep_going.wait()
await send({"type": "http.response.body", "body": b"2", "more_body": True})
await nearly_there.wait()
await send({"type": "http.response.body", "body": b"3", "more_body": False})
Yes, we should be able to test that, and I will update the PR accordingly (in the added tests I will also take into account the pending deprecation of the app argument in AsyncClient).
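For reference, the pending deprecation referred to here is the app=... shortcut on AsyncClient; below is a minimal sketch of the replacement, using the documented ASGITransport API (my_asgi_app is just a placeholder).

import httpx

async def my_asgi_app(scope, receive, send):
    ...  # placeholder ASGI application

# Pending deprecation: passing the ASGI app directly to the client.
old_style_client = httpx.AsyncClient(app=my_asgi_app)

# Replacement: wrap the app in an explicit ASGITransport.
new_style_client = httpx.AsyncClient(transport=httpx.ASGITransport(app=my_asgi_app))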
Force-pushed from 460832c to e249563
Force-pushed from e249563 to 672f459
The PR was updated, with a new test for incremental streaming of the response body.
After working with the updated version of ASGITransport, I ran into a further issue, and I have an idea for how to fix it; here is how I think it could work:
I would like your opinion on whether the idea mentioned above should be added to the current PR, or be the subject of a new PR.
Either would be okay with me.
Hi @jhominal,
Could you please confirm that your current patch will also not work with task_group.start_soon(wrap, partial(self.stream_response, send))? Do you know of any working alternatives? I have seen https://pypi.org/project/async-asgi-testclient/ and https://gist.github.com/richardhundt/17dfccb5c1e253f798999fc2b2417d7e, but I am not sure what to think of them. Thanks.
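For context, the line quoted above appears to come from Starlette's StreamingResponse, i.e. an app that sends its response body from a task it spawns itself. A rough illustration of that pattern (not code from this PR):

import anyio

async def app_with_background_sender(scope, receive, send):
    await send(
        {
            "type": "http.response.start",
            "status": 200,
            "headers": [(b"content-type", b"text/plain")],
        }
    )

    async def stream_body():
        for chunk in (b"1", b"2", b"3"):
            await send({"type": "http.response.body", "body": chunk, "more_body": True})
        await send({"type": "http.response.body", "body": b"", "more_body": False})

    # The body is sent from a child task, so send() is no longer called from
    # the task that invoked the application callable.
    async with anyio.create_task_group() as task_group:
        task_group.start_soon(stream_body)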
Hello @souliane, I have been working on and off on this issue for a while; in short:
Yep, that's feasible. I don't really understand if that's inherent to ASGI, or particular to how we're interfacing with the app here.
Sure. Does it make sense to have a task group running over the lifespan of the transport? That's then well-bounded.
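A rough skeleton of one way to read that suggestion, assuming httpx's AsyncBaseTransport interface and anyio; the request handling itself is elided, and this is not the implementation in this PR.

import anyio
import httpx


class TaskGroupASGITransport(httpx.AsyncBaseTransport):
    """Sketch: streaming tasks live in one task group that is opened when the
    transport is entered and closed again when it exits."""

    def __init__(self, app):
        self._app = app
        self._task_group = None

    async def __aenter__(self):
        # AsyncClient enters its transport, so the task group's lifespan is
        # bounded by the client's own "async with" block.
        self._task_group = anyio.create_task_group()
        await self._task_group.__aenter__()
        return self

    async def __aexit__(self, exc_type=None, exc_value=None, traceback=None):
        await self._task_group.__aexit__(exc_type, exc_value, traceback)
        self._task_group = None

    async def handle_async_request(self, request):
        # Here the ASGI call would be started with self._task_group.start_soon(...)
        # and body chunks forwarded to the returned response, e.g. through an
        # anyio memory object stream (omitted in this sketch).
        raise NotImplementedError("illustrative skeleton only")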
Is this PR still ongoing? I encountered the same issue.
@zuoc1993 It is still ongoing, yep. There's possibly a discussion to be had around the Transport API, and the limitations of the current approach.
Summary
As part of my job, we needed a variant of ASGITransport that supports streaming (as in #2186), and this is my PR to implement that. Something that I am particularly proud of is that this PR was written without having to spawn a new task, which means it avoids the issues related to task groups and context variables.
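To make that concrete, here is roughly how streaming consumption through ASGITransport is expected to look once the body is no longer buffered up front; the function name, URL, and app are illustrative, not taken from the PR.

import httpx


async def fetch_in_chunks(app):
    # app is any ASGI application under test.
    transport = httpx.ASGITransport(app=app)
    chunks = []
    async with httpx.AsyncClient(transport=transport, base_url="http://testserver") as client:
        async with client.stream("GET", "/download") as response:
            async for chunk in response.aiter_bytes():
                # Each chunk arrives as the app sends it, rather than after the
                # whole body has been produced.
                chunks.append(chunk)
    return b"".join(chunks)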
Checklist
Fixes #2186