
Parallel hub invocations #23535

Merged — 21 commits merged into release/5.0 from brecon/parallel on Aug 19, 2020
Conversation

@BrennanConroy (Member, author) commented Jun 30, 2020

Fixes #5351

HubConnectionContextOptions:

    public int MaximumParallelInvocations { get; set; } = 1;

HubOptions:

    public int MaxParallelInvocationsPerClient { get; set; }
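
For context, a minimal sketch of how a server might opt in to the proposed setting, assuming the HubOptions property named in the snippet above (the shipped name or shape may differ):

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddSignalR(hubOptions =>
        {
            // Assumed property from this PR's description; the proposed default of 1
            // (on HubConnectionContextOptions) preserves today's serialized behavior.
            hubOptions.MaxParallelInvocationsPerClient = 5;
        });
    }
}
```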

@BrennanConroy BrennanConroy added area-signalr Includes: SignalR clients and servers api-suggestion Early API idea and discussion, it is NOT ready for implementation labels Jun 30, 2020
@BrennanConroy BrennanConroy added this to the 5.0.0-preview8 milestone Jun 30, 2020
@davidfowl (Member) commented:

I would prefer to see a bounded channel in the dispatch loop.

@davidfowl (Member) commented:

Spoke to @BrennanConroy and we're going to do a bounded channel with N loops.

@BrennanConroy (Member, author) commented:

> we're going to do a bounded channel with N loops.

*try

@BrennanConroy (Member, author) commented:

The BoundedChannel approach has some pros and cons (see the sketch after this list):

Pros:

  • Hub messages are processed fully in parallel; other approaches would run the current hub message inline until it went async.

Cons:

  • Hub messages are processed in parallel:
    • Tests specifically turn on inline completions for pipelines so they can exercise very specific behavior. You could argue this is a test issue, and the tests can likely be modified to work.
    • Some hub message processing is assumed to run before the next hub message is parsed. For example, a streaming message needs to register its stream internally before the corresponding stream item message starts getting processed, but parallel hub message processing creates a race here.
  • (Max parallelism + 2) hub messages are parsed before backpressure is applied:
    • Timeout tests were assuming that if a hub message is being invoked then the timeout is paused; these tests can be modified to send an additional 2 messages to work around this.
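
For illustration, here is a minimal sketch of the "bounded channel with N loops" shape being weighed above. All names are placeholders, and this is not the code that was merged:

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

// Sketch: the parsing loop writes parsed messages into a bounded channel,
// and N consumer loops drain it, so at most N messages are invoked at once
// and the writer sees backpressure once the channel fills up.
public class ParallelMessageDispatcher<TMessage>
{
    private readonly Channel<TMessage> _messages;

    public ParallelMessageDispatcher(int maxParallelInvocations, Func<TMessage, Task> processMessage)
    {
        _messages = Channel.CreateBounded<TMessage>(maxParallelInvocations);

        // N loops, each pulling the next parsed message and invoking it.
        for (var i = 0; i < maxParallelInvocations; i++)
        {
            _ = Task.Run(async () =>
            {
                while (await _messages.Reader.WaitToReadAsync())
                {
                    while (_messages.Reader.TryRead(out var message))
                    {
                        await processMessage(message);
                    }
                }
            });
        }
    }

    // Called by the parsing loop; WriteAsync waits when the channel is full.
    public ValueTask DispatchAsync(TMessage message) => _messages.Writer.WriteAsync(message);
}
```

The exact point at which backpressure kicks in depends on the channel capacity plus whatever the loops have already dequeued, which is the kind of accounting behind the "(Max parallelism + 2)" note above.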

@BrennanConroy BrennanConroy marked this pull request as ready for review August 3, 2020 16:25
@BrennanConroy BrennanConroy added api-ready-for-review API is ready for formal API review - https://github.com/dotnet/apireviews and removed api-suggestion Early API idea and discussion, it is NOT ready for implementation labels Aug 16, 2020
@ghost commented Aug 16, 2020

Thank you for submitting this for API review. This will be reviewed by @dotnet/aspnet-api-review at the next meeting of the ASP.NET Core API Review group. Please ensure you take a look at the API review process documentation and ensure that:

  • The PR contains changes to the reference-assembly that describe the API change. Or, you have included a snippet of reference-assembly-style code that illustrates the API change.
  • The PR describes the impact to users, both positive (useful new APIs) and negative (breaking changes).
  • Someone is assigned to "champion" this change in the meeting, and they understand the impact and design of the change.

    internal static class SemaphoreSlimExtensions
    {
        public static async Task<Task> RunAsync<TState>(this SemaphoreSlim semaphoreSlim, Func<TState, Task> callback, TState state)

Review comment (Member):

I just realized this is going to allocate... Maybe we do a fast path like we do when we write a hub message.
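
As an illustration of the suggested fast path (hypothetical, and simplified to return Task rather than the Task<Task> shown in the diff): take the semaphore synchronously when it is uncontended so the common case avoids an async wait.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

internal static class SemaphoreSlimExtensions
{
    public static Task RunAsync<TState>(this SemaphoreSlim semaphoreSlim, Func<TState, Task> callback, TState state)
    {
        // Fast path: a slot is free right now, so skip the async wait entirely.
        if (semaphoreSlim.Wait(0))
        {
            return RunAndReleaseAsync(semaphoreSlim, callback, state);
        }

        // Slow path: asynchronously wait for a slot, then run the callback.
        return WaitThenRunAsync(semaphoreSlim, callback, state);

        static async Task WaitThenRunAsync(SemaphoreSlim semaphore, Func<TState, Task> cb, TState s)
        {
            await semaphore.WaitAsync();
            await RunAndReleaseAsync(semaphore, cb, s);
        }

        static async Task RunAndReleaseAsync(SemaphoreSlim semaphore, Func<TState, Task> cb, TState s)
        {
            try
            {
                await cb(s);
            }
            finally
            {
                semaphore.Release();
            }
        }
    }
}
```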

@pranavkm pranavkm added api-approved API was approved in API review, it can be implemented and removed api-ready-for-review API is ready for formal API review - https://github.com/dotnet/apireviews labels Aug 17, 2020
@BrennanConroy BrennanConroy changed the base branch from master to release/5.0 August 17, 2020 21:44
@BrennanConroy (Member, author) commented:

/azp run

@azure-pipelines commented:

Azure Pipelines successfully started running 2 pipeline(s).

@BrennanConroy BrennanConroy added the tell-mode Indicates a PR which is being merged during tell-mode label Aug 18, 2020
    return connection.ActiveInvocationLimit.RunAsync(state =>
    {
        var (dispatcher, descriptor, connection, invocationMessage) = state;
        return dispatcher.Invoke(descriptor, connection, invocationMessage, isStreamResponse: false, isStreamCall: false);

Review comment (Member):

Can the above call to Invoke ever throw?

Reply (Member, author):

Not really. I think if you wrote a custom auth handler and threw in there, it could throw. We could wrap that so it closes the connection with an error, but I would prefer to do that in a different change.
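
For illustration only, the kind of wrapping mentioned here might look like this (hypothetical helper, not part of this PR): catch an exception escaping the invocation and close the connection with an error instead.

```csharp
using System;
using System.Threading.Tasks;

public static class InvocationGuard
{
    // 'invoke' stands in for the dispatcher call above and 'closeWithError'
    // for closing the connection with an error message.
    public static async Task InvokeAndCloseOnErrorAsync(Func<Task> invoke, Func<string, Task> closeWithError)
    {
        try
        {
            await invoke();
        }
        catch (Exception ex)
        {
            await closeWithError($"Connection closed with an error. {ex.GetType().Name}: {ex.Message}");
        }
    }
}
```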

    }

    [Fact]
    public async Task StreamInvocationsDoNotBlockOtherInvocations()
Review comment (Member):

Does this mean there's still no way to limit the number of concurrent streaming invocations per client?

Reply (Member, author):

Yeah, we talked about this.

@BrennanConroy BrennanConroy merged commit 85bde1d into release/5.0 Aug 19, 2020
@BrennanConroy BrennanConroy deleted the brecon/parallel branch August 19, 2020 21:58
Labels
api-approved (API was approved in API review, it can be implemented), area-signalr (Includes: SignalR clients and servers), tell-mode (Indicates a PR which is being merged during tell-mode)

Linked issue
Non-serialized hub invocations

4 participants