
HTTP/2 stress test TaskCanceledException when client hasn't cancelled #42472

Closed
ManickaP opened this issue Sep 18, 2020 · 5 comments · Fixed by #54625

@ManickaP
Member

When a server receives a DATA frame after RST (known issue #1510, probably caused by the same problem as #42200), it faults the connection and sends GOAWAY. On the client, the connection gets closed and all the streams get reset, triggering Http2Stream._requestBodyCancellationSource. This CTS cancels the ongoing request body send, and a TaskCanceledException eventually propagates out of SendAsync. This is magnified by #38774, which tried to propagate errors from faulty request content to the user, and is therefore mostly visible in the POST ExpectContinue stress test operation.
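For context, here is a minimal repro-style sketch of the affected pattern (not taken from the actual stress suite; the URL, payload size, and class name are made up): a plain POST with `Expect: 100-continue` and no caller cancellation token that still surfaces a `TaskCanceledException` when the connection is torn down after a GOAWAY.

```csharp
// Minimal repro-style sketch, not the actual stress-test code. It issues a POST with
// "Expect: 100-continue" and no caller cancellation token; if the connection is torn
// down after a server GOAWAY, the request still fails with TaskCanceledException.
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class GoAwayCancellationSketch
{
    static async Task Main()
    {
        using var client = new HttpClient();

        var request = new HttpRequestMessage(HttpMethod.Post, "https://localhost:5001/")
        {
            Version = HttpVersion.Version20,
            Content = new ByteArrayContent(new byte[64 * 1024])
        };
        request.Headers.ExpectContinue = true;

        try
        {
            using HttpResponseMessage response = await client.SendAsync(request);
            Console.WriteLine((int)response.StatusCode);
        }
        catch (TaskCanceledException ex)
        {
            // No token was passed and HttpClient.Timeout did not elapse, yet the
            // request surfaces as "canceled" because the stream reset triggered
            // Http2Stream._requestBodyCancellationSource.
            Console.WriteLine($"Unexpected cancellation: {ex.Message}");
        }
    }
}
```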

@Dotnet-GitSync-Bot added the area-System.Net.Http and untriaged labels Sep 18, 2020
@ghost

ghost commented Sep 18, 2020

Tagging subscribers to this area: @dotnet/ncl
See info in area-owners.md if you want to be subscribed.

@ManickaP removed the untriaged label Sep 18, 2020
@ManickaP added this to the 6.0.0 milestone Sep 18, 2020
@geoffkizer
Contributor

Yeah, this is ugly. We should be reporting some sort of exception here that indicates the server sent us a GOAWAY that killed these requests. Otherwise it's super confusing.

@BrennanConroy
Member

I believe there is another case where you can unexpectedly get `System.Threading.Tasks.TaskCanceledException: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing.`

The client sends the headers first, and then kicks off two tasks: one that sends the data frames and one that reads the response headers. The client can receive an RST_STREAM because a stream limit was hit on the server, which should cause a retry. However, ReadResponseHeadersAsync calls Cancel(), which cancels _requestBodyCancellationSource and causes the previously mentioned data frame task to fail. This races with the response-headers task, so you get either a TaskCanceledException, which prevents the retry, or an HttpRequestException, which allows the retry.
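A highly simplified, self-contained model of that race (not the real Http2Stream code; the two tasks below merely stand in for the DATA-send path and ReadResponseHeadersAsync) shows how the exception type the caller observes depends on which task faults first:

```csharp
// Simplified model of the race described above: whichever task faults first decides
// whether the caller sees a retriable HttpRequestException or a non-retriable
// TaskCanceledException.
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class RaceModel
{
    static async Task Main()
    {
        var requestBodyCancellation = new CancellationTokenSource();

        // Stands in for the "send DATA frames" task.
        Task sendBody = Task.Delay(Timeout.Infinite, requestBodyCancellation.Token);

        // Stands in for ReadResponseHeadersAsync observing RST_STREAM(REFUSED_STREAM):
        // it cancels the body send and then reports the protocol error.
        Task readHeaders = Task.Run(() =>
        {
            requestBodyCancellation.Cancel();
            throw new HttpRequestException("Stream refused by server (REFUSED_STREAM).");
        });

        try
        {
            // Whichever task faults first determines the exception that propagates,
            // which in turn determines whether a retry can happen.
            await await Task.WhenAny(sendBody, readHeaders);
        }
        catch (TaskCanceledException)
        {
            Console.WriteLine("Body-send cancellation won the race: no retry.");
        }
        catch (HttpRequestException)
        {
            Console.WriteLine("Protocol error won the race: request can be retried.");
        }
    }
}
```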

@karelz
Member

karelz commented Jun 22, 2021

@alnikola is it similar to #42200 as mentioned above?

@alnikola
Contributor

The root cause of this issue is the same as #42200, but it seems this issue is more about proper error reporting than about fixing the root cause itself.

@alnikola self-assigned this Jun 23, 2021
@ghost added the in-pr label Jun 23, 2021
alnikola added a commit that referenced this issue Jul 2, 2021

If the server sends `GO_AWAY` to the client, `Http2Connection` handles it and stores the corresponding `Http2ConnectionException` in the `_resetException` field, then resets all active Http2Streams. Each of these streams is expected to rethrow that `_resetException` to communicate the original protocol error to the application code. However, `Http2Stream.SendDataAsync` currently doesn't take that field into account, so when it gets cancelled as part of a stream reset it just throws an `OperationCanceledException` that doesn't contain any details. This PR fixes that and makes `Http2Stream.SendDataAsync` throw the original `Http2ConnectionException` wrapped in an `IOException`.

Fixes #42472
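A rough sketch of the pattern the commit describes (not the actual dotnet/runtime code; `SendDataCoreAsync`, the message string, and the class shape are illustrative stand-ins):

```csharp
// Sketch only: when the body send is cancelled because the connection reset the
// stream, surface the stored protocol error instead of a bare cancellation.
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

class Http2StreamSketch
{
    // Set by the connection when it processes GOAWAY and resets the stream.
    private Exception? _resetException;

    public async Task SendDataAsync(ReadOnlyMemory<byte> buffer, CancellationToken cancellationToken)
    {
        try
        {
            await SendDataCoreAsync(buffer, cancellationToken).ConfigureAwait(false);
        }
        catch (OperationCanceledException) when (_resetException is Exception reset)
        {
            // The cancellation came from the connection tearing the stream down,
            // so report the original protocol error instead of a plain cancellation.
            throw new IOException("The request was aborted by the server.", reset);
        }
    }

    // Placeholder for the code that actually writes DATA frames.
    private Task SendDataCoreAsync(ReadOnlyMemory<byte> buffer, CancellationToken cancellationToken)
        => Task.Delay(10, cancellationToken);
}
```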
@ghost removed the in-pr label Jul 2, 2021
@ghost locked as resolved and limited conversation to collaborators Aug 1, 2021