Strange behaviour when stream PublisherID is too long #12499
Describe the bug
Two bugs in one:

First: the PublisherID length is not checked by the stream plugin front end. The plugin happily serializes it, and the PublisherID ends up in an `osiris_tracking:add` call, which does check the TrkId length and then crashes with `function_clause`. I also get no complaint from the Python client; it happily keeps the connection running, whereas when, say, Rabbit is dead or `gen_batch_server` is kept busy by breakpoints, I get a connection timeout. (A frame-encoding sketch follows after this list.)

Second (probably to be moved to osiris directly): I also got this when sending a smaller PublisherId of 256 bytes:
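To make the length mismatch concrete, here is a minimal sketch of how a `DeclarePublisher` frame lays out the publisher reference on the wire (frame layout per the RabbitMQ stream protocol; the helper names are mine). The protocol's int16 string-length prefix admits references of thousands of bytes, while the TrkId guard in `osiris_tracking:add` accepts far less, and the front end does not narrow the range in between:

```python
import struct

def encode_string(s: bytes) -> bytes:
    # Stream-protocol "string": int16 length prefix followed by the bytes,
    # so anything up to 32767 bytes serializes without complaint.
    return struct.pack(">h", len(s)) + s

def declare_publisher_frame(correlation_id: int, publisher_id: int,
                            publisher_ref: bytes, stream: bytes) -> bytes:
    # DeclarePublisher: key=0x0001, version=1, correlation id (uint32),
    # publisher id (uint8), then the reference and stream name as strings.
    body = struct.pack(">HHIB", 0x0001, 1, correlation_id, publisher_id)
    body += encode_string(publisher_ref) + encode_string(stream)
    # Each frame is prefixed with the size of its body as a uint32.
    return struct.pack(">I", len(body)) + body

# A 300-byte reference encodes fine here, but trips the TrkId length
# guard in osiris_tracking:add on the server side.
frame = declare_publisher_frame(1, 0, b"x" * 300, b"my-stream")
```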
Reproduction steps
Uncomment the third send (length of 256 and a constant value), and then the second one, to get the writer restarted. Restarting Rabbit also gives the same result. (A hypothetical reconstruction of the sends is sketched below.)
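The client script itself is not included in the issue, so the following is only a hypothetical reconstruction of the two problematic sends, reusing `declare_publisher_frame` from the sketch above. It assumes `sock` has already completed the stream-protocol handshake (PeerProperties, SaslHandshake, SaslAuthenticate, Tune, Open), which is omitted:

```python
import socket

sock = socket.create_connection(("localhost", 5552))
# ... handshake elided; the script's first, valid send is also omitted ...

# Second send: an over-long reference, which dies in osiris_tracking:add
# with function_clause.
sock.sendall(declare_publisher_frame(2, 0, b"x" * 300, b"my-stream"))

# Third send: a constant 256-byte reference, which gets the stream
# writer restarted (the second bug above).
sock.sendall(declare_publisher_frame(3, 1, b"y" * 256, b"my-stream"))
```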
Expected behavior
The stream writer doesn't crash, and the Python client yells at me, ideally with a proper error code.
Additional context