The higher the batch_size is, the more likely transmission errors become, such as timeouts or processing errors on the Logstash side.
While the Beats protocol is designed to handle such problems, I'd still keep the default conservatively low.
What about:
- making it easier to configure
- using a dynamic default value like batch_size = max(50, constants.QUEUED_EVENTS_BATCH_SIZE)? (see the sketch below)
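A minimal sketch of what that dynamic default could look like. The helper name and the module-level constant are placeholders for illustration (the real value lives in the project's constants module), not the project's actual API:

```python
# Illustrative only: QUEUED_EVENTS_BATCH_SIZE stands in for
# constants.QUEUED_EVENTS_BATCH_SIZE, and default_beats_batch_size
# is a hypothetical helper.
QUEUED_EVENTS_BATCH_SIZE = 1500  # example value used later in this thread

def default_beats_batch_size(configured=None, floor=50):
    """Return the configured batch size if set, otherwise the
    dynamic default max(floor, QUEUED_EVENTS_BATCH_SIZE)."""
    if configured is not None:
        return int(configured)
    return max(floor, QUEUED_EVENTS_BATCH_SIZE)

print(default_beats_batch_size())     # -> 1500
print(default_beats_batch_size(200))  # -> 200
```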
@cbeaujoin-stellar I implemented the new setting QUEUED_EVENTS_BEATS_BATCH_SIZE with a simple default. I think this is OK, as the batch size is configurable and users can set it as they like.
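For illustration, reading such a setting with a simple fallback could look roughly like this. The options-dict shape and the fallback value of 10 are assumptions; only the setting name comes from the comment above:

```python
def beats_batch_size(options, default=10):
    # Fall back to a simple default when QUEUED_EVENTS_BEATS_BATCH_SIZE
    # is not present in the store configuration (assumed layout).
    return int(options.get("QUEUED_EVENTS_BEATS_BATCH_SIZE", default))

print(beats_batch_size({}))                                       # -> 10
print(beats_batch_size({"QUEUED_EVENTS_BEATS_BATCH_SIZE": 500}))  # -> 500
```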
I think the batch_size should not be equal to 10 but should be equal to constants.QUEUED_EVENTS_BATCH_SIZE.
E.g. if constants.QUEUED_EVENTS_BATCH_SIZE = 1500, sending in batches of 10 will generate 1500/10 = 150 transactions, and that will take a lot of time (see the sketch below).
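A quick sketch of the arithmetic behind that concern, using a generic chunking helper (not the project's code):

```python
def iter_batches(events, batch_size):
    # Yield consecutive slices of at most batch_size events.
    for i in range(0, len(events), batch_size):
        yield events[i:i + batch_size]

events = list(range(1500))
print(sum(1 for _ in iter_batches(events, 10)))    # -> 150 transactions
print(sum(1 for _ in iter_batches(events, 1500)))  # -> 1 transaction
```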