consumer: batch record number >2*math.MaxUint16 leads to error decoding packet: invalid array length #1225
Comments
I also hit this problem when using kafka-exporter to scrape Kafka metrics; it logs: Cannot get consumer group: kafka: error decoding packet: invalid array length
@yehaotong that's a different issue you were seeing: invalid array length is a general encoding/decoding error. This issue is specifically about the batch record count.
Fixes IBM#1225
Signed-off-by: Mark Hindess <mark.hindess@gmail.com>
Thank you for taking the time to raise this issue. However, it has not had any activity on it in the past 90 days and will be closed in 30 days if no updates occur.
Versions
Sarama Version: 1.18
Kafka Version: 2.0.0
Go Version: 1.11
Configuration
What configuration values are you using for Sarama and Kafka?
Logs
When filing an issue please provide logs from Sarama and Kafka if at all possible. You can set sarama.Logger to a log.Logger to capture Sarama debug output.
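For example, a minimal sketch of enabling that debug output (sarama.Logger accepts any sarama.StdLogger, which the standard library's *log.Logger satisfies):

```go
package main

import (
	"log"
	"os"

	"github.com/Shopify/sarama"
)

func main() {
	// Route Sarama's internal debug logging to stderr.
	sarama.Logger = log.New(os.Stderr, "[sarama] ", log.LstdFlags)
}
```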
This may be caused by the check here: https://github.com/Shopify/sarama/blob/master/real_decoder.go#L88
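For context, here is a self-contained sketch of the kind of bounds check involved (paraphrased from the linked line, not a verbatim copy of the library source): a declared array length above 2*math.MaxUint16 is rejected outright, regardless of what Kafka itself allows.

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
	"math"
)

// decodeArrayLength mimics the decoder's bounds check (assumed shape): any
// declared array length above 2*math.MaxUint16 fails with "invalid array
// length", even though the wire format could represent larger counts.
func decodeArrayLength(raw []byte) (int, error) {
	if len(raw) < 4 {
		return -1, errors.New("insufficient data")
	}
	n := int(int32(binary.BigEndian.Uint32(raw)))
	if n > 2*math.MaxUint16 {
		return -1, errors.New("invalid array length")
	}
	return n, nil
}

func main() {
	// A batch claiming 2*math.MaxUint16+1 records trips the check.
	buf := make([]byte, 4)
	binary.BigEndian.PutUint32(buf, uint32(2*math.MaxUint16+1))
	_, err := decodeArrayLength(buf)
	fmt.Println(err) // invalid array length
}
```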
Problem Description
While consuming, we hit the invalid array length error. After changing the producer to set config.Producer.Flush.MaxMessages = math.MaxUint16, the consumer no longer hits it.
I printed the tmp value at the decoder line linked above; it never grows larger than the MaxMessages configured on the producer.
I think the problem is that with the default Flush config, the number of records in a batch can exceed 2*math.MaxUint16, and when consuming, Sarama treats any record count greater than 2*math.MaxUint16 as an invalid array length.
If Kafka itself has no limit on the number of records per batch, can we remove the length limit? Or would it be better to change the default Flush config to limit MaxMessages? A minimal sketch of the workaround follows below.
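For reference, a sketch of the producer-side workaround described above (the broker address "localhost:9092" and the topic name are placeholder assumptions, not from this issue):

```go
package main

import (
	"log"
	"math"

	"github.com/Shopify/sarama"
)

func main() {
	config := sarama.NewConfig()
	// Workaround from this issue: cap messages per batch so the consumer-side
	// decoder never sees a record count above 2*math.MaxUint16.
	config.Producer.Flush.MaxMessages = math.MaxUint16
	config.Producer.Return.Successes = true // required by SyncProducer

	producer, err := sarama.NewSyncProducer([]string{"localhost:9092"}, config)
	if err != nil {
		log.Fatal(err)
	}
	defer producer.Close()

	_, _, err = producer.SendMessage(&sarama.ProducerMessage{
		Topic: "example-topic", // hypothetical topic name
		Value: sarama.StringEncoder("hello"),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```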