This is an environment for testing Kafka Connect’s error handling behaviour. It’s vanilla Confluent Platform 5.1 plus:
- Three test data producers:
  - `json-corrupt-producer` / `json-producer` - use kafkacat to send JSON to `test_topic_json` (see the sketch after this list)
  - `avro-producer` - uses `kafka-avro-console-producer` to send Avro data to `test_topic_avro`
- jmxtrans / InfluxDB / Grafana - for extracting/storing/visualising JMX metrics from Kafka Connect
- Elasticsearch for testing Sink behaviour
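Purely as an illustration of what the kafkacat-based producers do, a minimal sketch is below; the broker address and payloads are assumptions, not the exact commands baked into the containers:

```bash
# Valid JSON record to test_topic_json (broker address is an assumption)
echo '{"id": 1, "product": "widget"}' | \
  kafkacat -b localhost:9092 -t test_topic_json -P

# Deliberately malformed (non-JSON) record to the same topic,
# to trigger deserialisation errors downstream
echo 'this is not json {{{' | \
  kafkacat -b localhost:9092 -t test_topic_json -P
```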
Each connector’s definition is available in the main text of the supporting article.
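As an illustration of the kind of error-handling settings those definitions exercise, a sink connector for the JSON topic might look something like the following; the connector name, DLQ topic name, and Elasticsearch URL here are placeholders, not the article's exact configuration:

```bash
# Create/update a sink connector via the Kafka Connect REST API,
# routing bad records to a dead letter queue instead of failing the task
curl -X PUT http://localhost:8083/connectors/sink-elastic-json/config \
     -H "Content-Type: application/json" \
     -d '{
       "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
       "topics": "test_topic_json",
       "connection.url": "http://elasticsearch:9200",
       "type.name": "_doc",
       "key.ignore": "true",
       "schema.ignore": "true",
       "value.converter": "org.apache.kafka.connect.json.JsonConverter",
       "value.converter.schemas.enable": "false",
       "errors.tolerance": "all",
       "errors.deadletterqueue.topic.name": "dlq_sink_elastic_json",
       "errors.deadletterqueue.topic.replication.factor": "1",
       "errors.deadletterqueue.context.headers.enable": "true",
       "errors.log.enable": "true",
       "errors.log.include.messages": "true"
     }'
```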
Each producer is a separate Docker container that runs once. To run them again:
docker-compose start json-producer
docker-compose start json-corrupt-producer
docker-compose start avro-producer
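To check that the messages landed, you can consume the topics directly, e.g. (broker address is an assumption):

```bash
# Read test_topic_json from the beginning and exit once caught up
kafkacat -b localhost:9092 -t test_topic_json -C -o beginning -e
```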
This command should create the InfluxDB data source in Grafana:
curl --user admin:admin -X POST http://localhost:3000/api/datasources -H "Content-Type: application/json" -d '{"orgId":1,"name":"InfluxDB","type":"influxdb","typeLogoUrl":"","access":"proxy","url":"http://influxdb:8086","password":"","user":"","database":"influx","basicAuth":false,"basicAuthUser":"","basicAuthPassword":"","withCredentials":false,"isDefault":true,"jsonData":{"keepCookies":[]},"secureJsonFields":{},"version":2,"readOnly":false}'
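To confirm the data source was registered, list Grafana's data sources:

```bash
curl --user admin:admin http://localhost:3000/api/datasources
```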
Manually import the JSON dashboard definition from `config/grafana-dashboard_by_connector.json`.
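If you prefer to script that step, something along these lines should work against Grafana's dashboard API; it assumes the file contains a bare dashboard object and that `jq` is available, neither of which is guaranteed by the setup above:

```bash
# Wrap the exported dashboard in the payload POST /api/dashboards/db expects,
# dropping any stale numeric id so Grafana creates it fresh
jq '{dashboard: (del(.id)), overwrite: true}' config/grafana-dashboard_by_connector.json | \
  curl --user admin:admin -X POST http://localhost:3000/api/dashboards/db \
       -H "Content-Type: application/json" -d @-
```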