
fix #1387 #1400

Merged

merged 1 commit into ergochat:master on Nov 29, 2020
Conversation

slingamn
Member

I measured a speedup of around 20% on the chanflood benchmark from this.

One area where this actually makes things worse: we now use longer batch IDs when relaying multilines. Because the relayed batches are shared globally (due to caching), the IDs need more entropy to stay unique, which could conceivably reduce performance in some workloads.

The new bitset routines are not used.

Instead of building a new serialized message for each recipient,
try to cache them.
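The idea above can be sketched as follows: serialize each outgoing line once per distinct variant (e.g. with or without message tags) and reuse the bytes for every recipient. This is an illustrative sketch only; the `cachedMessage` type and its fields are hypothetical, not ergo's actual API.

```go
package main

import "fmt"

// cachedMessage caches the serialized wire form of an outgoing IRC line,
// keyed by whether the recipient has the message-tags capability, so the
// line is serialized once per variant instead of once per recipient.
type cachedMessage struct {
	tags  string          // serialized tags, e.g. "@time=..."
	line  string          // rest of the line, e.g. ":nick PRIVMSG #chan :hi"
	cache map[bool][]byte // hasTags -> serialized bytes
}

func (m *cachedMessage) bytes(hasTags bool) []byte {
	if b, ok := m.cache[hasTags]; ok {
		return b // cache hit: no re-serialization for this recipient
	}
	var b []byte
	if hasTags && m.tags != "" {
		b = []byte(m.tags + " " + m.line + "\r\n")
	} else {
		b = []byte(m.line + "\r\n")
	}
	m.cache[hasTags] = b
	return b
}

func main() {
	m := &cachedMessage{
		tags:  "@time=2020-11-29T00:00:00.000Z",
		line:  ":alice PRIVMSG #chan :hello",
		cache: make(map[bool][]byte),
	}
	// Three recipients without message-tags share one serialization.
	for i := 0; i < 3; i++ {
		fmt.Printf("%q\n", m.bytes(false))
	}
	fmt.Println(len(m.cache)) // only one cached variant was built
}
```

The cache lives on the message, not the recipient, which is why any IDs embedded in the cached bytes (such as multiline batch IDs) become globally shared.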
@slingamn slingamn added this to the v2.5 milestone Nov 27, 2020
@slingamn
Member Author

It's probably perfectly safe to cut the batch IDs in half (going from 26 bytes to 13 bytes on the wire, and from 128 to 64 bits of entropy).
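For scale: the 26-to-13 figures are consistent with random bytes rendered in unpadded base32, which carries 5 bits per character, so 128 bits (16 bytes) of entropy needs ceil(128/5) = 26 characters on the wire and 64 bits (8 bytes) needs 13. A minimal sketch assuming that encoding; the `newBatchID` helper is hypothetical, not ergo's actual utility:

```go
package main

import (
	"crypto/rand"
	"encoding/base32"
	"fmt"
)

// b32 is standard base32 without padding: 5 bits per output character.
var b32 = base32.StdEncoding.WithPadding(base32.NoPadding)

// newBatchID returns a random ID carrying nBytes of entropy.
func newBatchID(nBytes int) string {
	buf := make([]byte, nBytes)
	if _, err := rand.Read(buf); err != nil {
		panic(err)
	}
	return b32.EncodeToString(buf)
}

func main() {
	fmt.Println(len(newBatchID(16))) // 128 bits of entropy -> 26 chars on the wire
	fmt.Println(len(newBatchID(8)))  // 64 bits of entropy  -> 13 chars on the wire
}
```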

@slingamn slingamn merged commit 0fcaf77 into ergochat:master Nov 29, 2020
@slingamn slingamn deleted the issue1387_messagecaching.4 branch March 5, 2021 02:55