
consumer: batch record number >2*math.MaxUint16 leads to error decoding packet: invalid array length #1225

Closed
july2993 opened this issue Nov 27, 2018 · 6 comments
Labels
bug, needs-investigation (Issues that require followup from maintainers), stale (Issues and pull requests without any recent activity)

Comments

@july2993

Versions

Sarama Version:
1.18
Kafka Version:
2.0.0
Go Version:
1.11

Configuration

What configuration values are you using for Sarama and Kafka?
sarama

    sarama.MaxResponseSize = 1 << 30
    sarama.MaxRequestSize = 1 << 30
    config.Producer.MaxMessageBytes = 1 << 30
    config.Producer.Partitioner = sarama.NewManualPartitioner
    config.Producer.RequiredAcks = sarama.WaitForAll
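
A minimal sketch of how these values fit together when building the producer config (the broker address below is a placeholder, not from the original report):

    package main

    import "github.com/Shopify/sarama"

    func newProducer() (sarama.SyncProducer, error) {
        // Package-level limits on request/response sizes.
        sarama.MaxResponseSize = 1 << 30
        sarama.MaxRequestSize = 1 << 30

        config := sarama.NewConfig()
        config.Producer.MaxMessageBytes = 1 << 30
        config.Producer.Partitioner = sarama.NewManualPartitioner
        config.Producer.RequiredAcks = sarama.WaitForAll
        config.Producer.Return.Successes = true // required by SyncProducer

        // "localhost:9092" is a placeholder broker address.
        return sarama.NewSyncProducer([]string{"localhost:9092"}, config)
    }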
Logs

When filing an issue please provide logs from Sarama and Kafka if at all
possible. You can set sarama.Logger to a log.Logger to capture Sarama debug
output.

sarama
This may be caused by the check here: https://github.com/Shopify/sarama/blob/master/real_decoder.go#L88

error decoding packet: invalid array length
Problem Description

While consuming, I hit the invalid array length error.

After changing the producer (setting config.Producer.Flush.MaxMessages = math.MaxUint16), the consumer no longer hits the invalid array length error.

I printed tmp at the line that may be causing this (https://github.com/Shopify/sarama/blob/master/real_decoder.go#L88); it is never bigger than the MaxMessages config set in the producer.
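
For reference, a standalone sketch of the kind of array-length sanity check involved (paraphrasing the logic around that line, not the exact Sarama source):

    package main

    import (
        "encoding/binary"
        "errors"
        "fmt"
        "math"
    )

    var errInvalidArrayLength = errors.New("invalid array length")

    // decodeArrayLength mimics the check Sarama's decoder applies to an array
    // length read from a response: values above 2*math.MaxUint16 are rejected,
    // even if the batch itself is otherwise valid.
    func decodeArrayLength(buf []byte) (int, error) {
        if len(buf) < 4 {
            return -1, errors.New("insufficient data")
        }
        n := int(int32(binary.BigEndian.Uint32(buf)))
        if n < -1 || n > 2*math.MaxUint16 {
            return -1, errInvalidArrayLength
        }
        return n, nil
    }

    func main() {
        buf := make([]byte, 4)
        binary.BigEndian.PutUint32(buf, uint32(2*math.MaxUint16+1)) // one record too many
        _, err := decodeArrayLength(buf)
        fmt.Println(err) // invalid array length
    }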

I think the problem is that with the default Flush config, the batch record number may be bigger than 2*math.MaxUint16, and when consuming, Sarama treats it as an invalid array length if the record number is > 2*math.MaxUint16.

If Kafka has no limit on the number of records per batch, can we remove the length limit? Or would it be better to change the default Flush config to limit MaxMessages?
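
A minimal sketch of the producer-side workaround mentioned above, capping how many messages get batched per flush so a single record batch stays under the decoder's limit:

    import "math"

    // Cap the number of messages per producer flush so that no record batch
    // exceeds 2*math.MaxUint16 records on the consumer side.
    config.Producer.Flush.MaxMessages = math.MaxUint16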


@ghost ghost added the stale (Issues and pull requests without any recent activity) label Feb 21, 2020
@ghost ghost closed this as completed Mar 22, 2020

@yehaotong

I also hit this problem when using kafka-exporter to scrape Kafka metrics; it logs: Cannot get consumer group: kafka: error decoding packet: invalid array length

@dnwe dnwe reopened this Jul 25, 2023
@dnwe dnwe removed the stale (Issues and pull requests without any recent activity) label Jul 25, 2023
@dnwe
Collaborator

dnwe commented Aug 21, 2023

@yehaotong that's a different issue you were seeing — invalid array length is a general encoding/decoding error

This issue is specifically talking about the batch record numbers:

I think the problem is that with the default Flush config, the batch record number may be bigger than 2*math.MaxUint16, and when consuming, Sarama treats it as an invalid array length if the record number is > 2*math.MaxUint16.

@dnwe dnwe added the bug and needs-investigation (Issues that require followup from maintainers) labels and removed the bug :-( and enhancement labels Aug 21, 2023
@dnwe dnwe changed the title from "consumer meet error: error decoding packet: invalid array length" to "consumer: batch record number >2*math.MaxUint16 leads to error decoding packet: invalid array length" Aug 21, 2023
hindessm added a commit to hindessm/sarama that referenced this issue Aug 29, 2023
Fixes IBM#1225

Signed-off-by: Mark Hindess <mark.hindess@gmail.com>
dnwe pushed a commit that referenced this issue Aug 29, 2023
…2628)

Fixes #1225

Signed-off-by: Mark Hindess <mark.hindess@gmail.com>
@dnwe dnwe reopened this Aug 29, 2023

Thank you for taking the time to raise this issue. However, it has not had any activity on it in the past 90 days and will be closed in 30 days if no updates occur.
Please check if the main branch has already resolved the issue since it was raised. If you believe the issue is still valid and you would like input from the maintainers then please comment to ask for it to be reviewed.

@github-actions github-actions bot added the stale (Issues and pull requests without any recent activity) label Dec 12, 2023
@github-actions github-actions bot closed this as not planned (Won't fix, can't repro, duplicate, stale) Jan 11, 2024