BeatsTransport _batch_size #93

Closed
cbeaujoin-stellar opened this issue Feb 15, 2024 · 3 comments
cbeaujoin-stellar commented Feb 15, 2024

I think the _batch_size should not be equal to 10 but should instead be equal to constants.QUEUED_EVENTS_BATCH_SIZE.

E.g. if constants.QUEUED_EVENTS_BATCH_SIZE = 1500, it will generate 1500 / 10 = 150 separate transmissions, and that will take a lot of time.
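For illustration, a minimal sketch of the arithmetic above; the names and the chunking loop are illustrative only and are not the transport's actual internals:

```python
# Illustrative sketch only, not the library's actual code: shows why a
# hard-coded batch size of 10 turns one flush of the event queue into
# many separate Beats transmissions.
QUEUED_EVENTS_BATCH_SIZE = 1500   # events pulled from the queue per flush
BEATS_BATCH_SIZE = 10             # events per Beats batch (hard-coded today)

queued_events = [f"event-{i}" for i in range(QUEUED_EVENTS_BATCH_SIZE)]

# Each chunk of BEATS_BATCH_SIZE events is one round trip to Logstash.
batches = [
    queued_events[i:i + BEATS_BATCH_SIZE]
    for i in range(0, len(queued_events), BEATS_BATCH_SIZE)
]
print(len(batches))  # 150 transmissions for a single flush
```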

eht16 (Owner) commented Mar 3, 2024

We could make it more easily configurable.

The higher the batch_size, the more likely transmission errors become, such as timeouts or processing errors on the Logstash side.
While the Beats protocol is designed to handle such problems, I'd still keep the default conservatively low.

What about:

  • making it easier to configure
  • using a dynamic default value like batch_size = max(50, constants.QUEUED_EVENTS_BATCH_SIZE) (see the sketch below)?
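A hypothetical sketch of that dynamic-default idea; this is not the change that was eventually merged, and the helper name is made up for illustration:

```python
# Hypothetical helper illustrating the proposal above; not the merged change.
def default_beats_batch_size(queued_events_batch_size: int) -> int:
    # Never drop below 50, otherwise follow the queue batch size.
    return max(50, queued_events_batch_size)

print(default_beats_batch_size(10))    # 50
print(default_beats_batch_size(1500))  # 1500
```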

cbeaujoin-stellar (Author) commented
Hi,
Yes, that sounds like a good trade-off.

eht16 closed this as completed in be6e30e on Apr 13, 2024
eht16 (Owner) commented Apr 13, 2024

@cbeaujoin-stellar I implemented the new setting QUEUED_EVENTS_BEATS_BATCH_SIZE with a simple default. I think this is OK as the batch size is configurable and users can set it as they like.
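For reference, a hedged usage sketch, assuming the project is python-logstash-async and that the new setting is overridden on the shared logstash_async.constants.constants object like the other QUEUED_EVENTS_* tunables; the handler wiring and its argument names are assumptions shown for context, not taken from this thread:

```python
import logging

from logstash_async.constants import constants
from logstash_async.handler import AsynchronousLogstashHandler

# Assumption: override the new tunable on the shared constants object before
# the handler/transport is created, as with the other QUEUED_EVENTS_* settings.
constants.QUEUED_EVENTS_BEATS_BATCH_SIZE = 500  # events per Beats transmission

# Handler wiring shown only for context; argument names may differ by version.
handler = AsynchronousLogstashHandler(
    'logstash.example.org',
    5044,
    database_path=None,
    transport='logstash_async.transport.BeatsTransport',
)
logging.getLogger('app').addHandler(handler)
```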
