
improve backpressure mechanism for in-memory batch channel #743

Closed
jordanrfrazier opened this issue Sep 7, 2023 · 0 comments · Fixed by #746
Labels
enhancement New feature or request

Comments

@jordanrfrazier
Collaborator

The in-memory bounded channel is lagging behind during load tests. A couple of options:

  1. Block Python on send: wait for space in the channel before sending a batch.
  2. Assemble larger batches in the consumer to speed up the rate of consumption.
  3. Instead of blocking, concatenate batches in the sender to send bigger batches less frequently.
@jordanrfrazier jordanrfrazier added bug Something isn't working enhancement New feature or request and removed bug Something isn't working labels Sep 7, 2023
github-merge-queue bot pushed a commit that referenced this issue Sep 12, 2023
This closes #743

---------

Co-authored-by: Jordan Frazier <jordan.frazier@datastax.com>