Regression with Avro producer in 0.18.0 as compared to 0.16.0 #761

Closed
timtebeek opened this issue Jul 16, 2021 · 4 comments · Fixed by #776
Labels
backend Need a backend update bug Something isn't working topic data Kafka Topic data

Comments

@timtebeek (Contributor)

Perhaps similar to #757, but with slightly different characteristics; noticed after recently switching from 0.16.0 to 0.18.0.

So as a bit of background: we use org.springframework.kafka.listener.DeadLetterPublishingRecoverer, so any messages that fail handling in our Kafka consumers are pushed onto a dead letter topic with the exact same schema and body. Then, once the underlying handling issue is fixed, I used to copy-paste the key & value from the dead letter topic, as viewed through AKHQ, into the producer form of the original topic in AKHQ, selecting the appropriate value schema.
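
For reference, a minimal sketch of that wiring (assuming Spring Kafka 2.x; the configuration class and bean names here are illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;

@Configuration
class DeadLetterConfig {

    // With the default destination resolver, failed records are re-published
    // unchanged (same key, value and headers) to "<original topic>.DLT".
    @Bean
    DeadLetterPublishingRecoverer dltRecoverer(KafkaTemplate<Object, Object> template) {
        return new DeadLetterPublishingRecoverer(template);
    }
}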

After switching to 0.18.0 I noticed I can no longer directly copy&paste the value for our Avro schemas, due to small incompatibilities in the (de)serializers used.

  1. For instance, we have a timestamp-millis Avro field:
{
  "name": "orderDateTime",
  "type": {
    "type": "long",
    "logicalType": "timestamp-millis"
  }
},

The viewer seems to convert this to an Instant: https://github.com/tchiotludo/akhq/blob/0.18.0/src/main/java/org/akhq/utils/AvroDeserializer.java#L150
This is rendered in the form 2011-12-03T10:15:30Z.

However, the producer form seems to use a slightly different date-time format: https://github.com/tchiotludo/akhq/blob/0.18.0/src/main/java/org/akhq/utils/AvroSerializer.java#L44
Notice the three additional fractional digits: microseconds rather than milliseconds.
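
A small self-contained sketch of the mismatch; the producer-side pattern below is a hypothetical approximation of the stricter format, not a verbatim copy of the linked serializer:

import java.time.Instant;
import java.time.format.DateTimeFormatter;

public class TimestampMismatch {
    public static void main(String[] args) {
        // Viewer side: the deserializer turns the epoch-millis long into an
        // Instant, whose ISO-8601 toString() omits a zero fractional second.
        Instant viewed = Instant.ofEpochMilli(1322907330000L);
        System.out.println(viewed); // 2011-12-03T10:15:30Z

        // Producer side (hypothetical approximation): a pattern that demands
        // six fractional digits accepts only the longer form ...
        DateTimeFormatter producer =
                DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS'Z'");
        producer.parse("2011-12-03T10:15:30.000000Z"); // OK
        // ... and rejects exactly what the viewer rendered:
        producer.parse("2011-12-03T10:15:30Z");        // DateTimeParseException
    }
}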

  2. We also have a field that's an array of records, with a default value of an empty array:
{
  "name": "services",
  "type": {
    "type": "array",
    "items": {
      "type": "record",
      "name": "Service",
      "fields": [...omitted...]
    }
  },
  "default": []
},

When empty, the field is omitted in the viewer, yet it's required (as an empty array) in the producer, even though the field has a default value.
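
For comparison, Avro's own GenericRecordBuilder backfills unset fields from their schema defaults, so an omitted field with a default need not be rejected. A minimal sketch, using an inline stand-in schema rather than our real one:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.generic.GenericRecordBuilder;

public class DefaultBackfill {
    public static void main(String[] args) {
        // Stand-in schema: one array field declaring "default": [].
        Schema schema = new Schema.Parser().parse(
                "{\"type\": \"record\", \"name\": \"Order\", \"fields\": ["
                + "{\"name\": \"services\", \"type\": "
                + "{\"type\": \"array\", \"items\": \"string\"}, \"default\": []}]}");

        // "services" is never set; build() falls back to the schema default.
        GenericRecord record = new GenericRecordBuilder(schema).build();
        System.out.println(record); // {"services": []}
    }
}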

Both issues prevent us from copy-pasting directly from the view of a topic with Avro content into the producer form of a topic with the same Avro schema, requiring tedious and error-prone manual intervention when we propagate records from our DLT topic back to the original topic.

Could the formatting and requirements of the viewer and producer be harmonized, to restore a quick way to copy-paste content between topics?

@timtebeek (Contributor, Author)

@tchiotludo were you planning on a release any time soon? This issue in particular makes it hard for us to recover records pushed onto a dead letter topic, so a release would be much appreciated.

And on that note: have you considered adopting JReleaser? Seems like that could make the release flow easier. I'd be willing to help out if needed!

@tchiotludo (Owner)

I will release next week I think 😄

I've never looked at JReleaser, but for now the release is not so hard: I just tag a commit and the GitHub Actions handle all the things ;)

@tchiotludo (Owner)

tchiotludo commented Oct 29, 2021

Lots of weeks/months later, the release happened 😄

@timtebeek (Contributor, Author)

Haha, glad to see the new release, congrats!
Already pinned in personal project & at work.
Plus the new docs look great; big improvement! :)
