
Change cursors batch time formula on presence update #333

Merged 1 commit into main on Sep 18, 2024

Conversation

@JoaoDiasAbly (Contributor) commented Sep 4, 2024

https://ably.atlassian.net/browse/REA-1868

This PR changes the rate at which we adapt the (client-side) batch time: the outboundBatchInterval is now multiplied by the number of 100-member slots in the presence set, instead of by the raw number of presence members.

This change, which smooths the client-side batching interval, is motivated by the addition of server-side batching, now enabled by default for cursors channels.

@lmars (Member) commented Sep 6, 2024

@JoaoDiasAbly given this is a public repo, please can you add a description of the change directly, since not everyone will be able to see the linked ticket.

@@ -105,7 +105,7 @@ export default class Cursors extends EventEmitter<CursorsEventMap> {
     const channel = this.getChannel();
     const cursorsMembers = await channel.presence.get();
     this.cursorBatching.setShouldSend(cursorsMembers.length > 1);
-    this.cursorBatching.setBatchTime(cursorsMembers.length * this.options.outboundBatchInterval);
+    this.cursorBatching.setBatchTime(Math.ceil(cursorsMembers.length / 100) * this.options.outboundBatchInterval);
Member:

It's hard to understand why this is divided by 100 now.

Contributor (author):

Before, we were adjusting with cursorsMembers.length * this.options.outboundBatchInterval, which made the client-side batching latency climb linearly with the size of the presence set. Since we now have server-side batching, the client-side growth doesn't need to be as steep. The ticket therefore suggested that the batch time climb in groups of 100 members, which makes client-side batching latency grow much more slowly.

Example assuming outboundBatchInterval=25ms:

  • if 1 <= #presenceMembers <= 100: batchTime=25ms
  • if 101 <= #presenceMembers <= 200: batchTime=50ms
  • if 201 <= #presenceMembers <= 300: batchTime=75ms
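The old and new behaviour can be sketched as standalone functions (the function names here are illustrative, not part of the Spaces API):

```typescript
// Old behaviour: batch time grows linearly with the presence set size.
function oldBatchTime(presenceMembers: number, outboundBatchInterval: number): number {
  return presenceMembers * outboundBatchInterval;
}

// New behaviour: batch time steps up once per slot of 100 members, so any
// full or partial slot of 100 adds one multiple of the base interval.
function newBatchTime(presenceMembers: number, outboundBatchInterval: number): number {
  return Math.ceil(presenceMembers / 100) * outboundBatchInterval;
}

// With outboundBatchInterval = 25ms and 500 presence members:
//   old: 500 * 25 = 12500ms
//   new: Math.ceil(500 / 100) * 25 = 125ms
console.log(oldBatchTime(500, 25), newBatchTime(500, 25));
```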

Member:

OK, thanks, that makes sense. Perhaps put that as a comment in the code?

Contributor (author):

Added as a comment in the code.

@JoaoDiasAbly JoaoDiasAbly merged commit 12467dd into main Sep 18, 2024
7 checks passed
@JoaoDiasAbly JoaoDiasAbly deleted the feature/REA-1868-cursors-batching branch September 18, 2024 07:29