Debezium CDC Connector to send Events to Kafka-Enabled Event Hub #53
Comments
This is interesting. What are these fields?
I don't have time at the moment to dig into the connector docs, unfortunately, but we can work through these together since you've used the connector previously.
I've been able to configure it so that it's almost working, but I now have some problems with authentication:
I've been able to get Debezium to connect correctly to the Event Hubs Kafka endpoint and create the topics. I'm testing Debezium using the Docker image provided in the Debezium tutorial. This is the .yaml I used to make it work:
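(The .yaml itself didn't survive in this thread. For context, here is a minimal sketch of what such a docker-compose file could look like, assuming the debezium/connect image from the tutorial; the service name, image tag, topic names, and the <NAMESPACE>/<KEY> placeholders are illustrative, not the original poster's values.)

version: '2'
services:
  connect:
    image: debezium/connect:1.9
    ports:
      - 8083:8083
    environment:
      # Point Kafka Connect at the Event Hubs Kafka endpoint
      - BOOTSTRAP_SERVERS=<NAMESPACE>.servicebus.windows.net:9093
      - GROUP_ID=1
      - CONFIG_STORAGE_TOPIC=connect_configs
      - OFFSET_STORAGE_TOPIC=connect_offsets
      - STATUS_STORAGE_TOPIC=connect_statuses
      # Event Hubs only accepts SASL_SSL with the PLAIN mechanism
      - CONNECT_SECURITY_PROTOCOL=SASL_SSL
      - CONNECT_SASL_MECHANISM=PLAIN
      # Note the doubled $$ so docker-compose does not expand
      # $ConnectionString as a compose variable (see the note below)
      - CONNECT_SASL_JAAS_CONFIG=org.apache.kafka.common.security.plain.PlainLoginModule required username="$$ConnectionString" password="Endpoint=sb://<NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<KEY>";
      # The same three security settings are typically repeated with the
      # CONNECT_PRODUCER_ and CONNECT_CONSUMER_ prefixes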
Note that I had to use a double $ character to make sure it was escaped properly (this is needed only when running Debezium via the provided Docker image, I guess). I then registered the connector for SQL Server (more precisely, Azure SQL Managed Instance) using the following JSON:
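(The JSON wasn't captured either; below is a hedged sketch of a Debezium 0.x/1.x SQL Server connector registration, with every hostname, credential, and topic name a placeholder rather than the original values.)

{
  "name": "sqlserver-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "database.hostname": "<SQL_MI_HOST>",
    "database.port": "1433",
    "database.user": "<USER>",
    "database.password": "<PASSWORD>",
    "database.dbname": "<DATABASE>",
    "database.server.name": "<LOGICAL_SERVER_NAME>",
    "table.whitelist": "<SCHEMA>.<TABLE>",
    "database.history.kafka.bootstrap.servers": "<NAMESPACE>.servicebus.windows.net:9093",
    "database.history.kafka.topic": "<HISTORY_TOPIC>"
  }
}

A file like this would be registered by POSTing it to the Connect REST API, e.g. curl -X POST -H "Content-Type: application/json" --data @register-sqlserver.json http://localhost:8083/connectors.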
The issue I'm having right now is this error message:
This seems to be related to the fact that the Event Hubs Kafka endpoint doesn't support 100% of the Kafka feature set. Any known workaround for this?
Found a workaround by looking in the source code: it is possible to use something other than Kafka to store schema changes. For example, they can be stored in memory using the MemoryDatabaseHistory class. Not sure how useful this could be, but at least it makes the whole thing work! Just add this to the JSON configuration:
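(The snippet itself was lost from the thread; based on the Debezium 0.x/1.x database.history option described here, it would be a single extra entry inside the connector's "config" object:)

"database.history": "io.debezium.relational.history.MemoryDatabaseHistory"

Keep in mind an in-memory history is discarded on restart, which is presumably why the commenter is unsure how useful it is; it sidesteps the Event Hubs incompatibility rather than fixing it.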
@yorek are you saying that you got Kafka Connect / Debezium to make the topics for the watched tables before adding the
You mean the database table topics, or the system topics that are created on Connect's startup?
I was referring to the Debezium "system" topics that are created when Debezium starts. It's been a while now, so I'm not sure if this still works with the current version of Debezium.
@yorek This is really helpful.
This no longer seems to work with the current version of Debezium/Connect. The producers and consumers repeatedly disconnect.
Did you find a solution to this issue, netchex-tony?
Description
Data replication using Kafka Connect and Kafka-enabled Event Hub. We have it working with the Confluent Kafka broker and Confluent Kafka Connect. We run into issues when we replace the Kafka broker with Event Hub: we are not getting any events from the CDC connector to Event Hub. We see the control topics being created, but the CDC topic for the table never gets created.
It does work with the File Sync connector.
How to reproduce
Has it worked previously?
No
Checklist
IMPORTANT: We will close issues where the checklist has not been completed.
Please provide the following information:
<REPLACE with e.g., Java quickstart>
Confluent 5.2.1
<REPLACE with e.g., auto.offset.reset=earliest, ..>
(do not include your connection string or SAS Key)
Producer
<REPLACE with e.g., partitionID=3, groupID=consumer-group>
<REPLACE with e.g., Nov 7 2018 - 17:15:01 UTC>
<REPLACE with e.g., clientID=kafka-client>
<REPLACE with e.g., Willing/able to send scenario to repro issue>
RHEL latest
Anish => Red Hat Enterprise Linux Server release 7.6 (Maipo)
Logs with Debugging enabled
No errors in the logs. It's connected to the Event Hub and has created the default internal topics (connect-cluster-configs, connect-cluster-offsets, connect-cluster-status). The connector is not even trying to connect to the SQL Server. Per Debezium community support, there is no proven solution for the CDC connector to connect to the Event Hub.
Kafka Client Configuration
# --- Kafka Connect worker configuration ---
group.id=connect-cluster-group_db2
bootstrap.servers=.servicebus.windows.net:9093
# Note: the JSON converter settings are overridden by the Avro converter
# settings below (the last occurrence of a key wins in a properties file);
# the Avro converters would also need a schema.registry.url to be set
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
#internal.key.converter=org.apache.kafka.connect.json.JsonConverter
#internal.value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
# Standalone-mode setting; ignored when Connect runs in distributed mode
offset.storage.file.filename=/tmp/eventhub.offsets
offset.flush.interval.ms=10000
#key.serializer=org.apache.kafka.common.serialization.StringSerializer
#value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Debezium SMT that unwraps the CDC envelope to the plain row state
transforms=unwrap
transforms.unwrap.type=io.debezium.transforms.UnwrapFromEnvelope
transforms.unwrap.drop.tombstones=false
config.storage.topic=connect-cluster-configs_db2
offset.storage.topic=connect-cluster-offsets_db2
status.storage.topic=connect-cluster-status_db2
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1
# --- Debezium SQL Server connector configuration ---
name=kafka-poc
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
tasks.max=1
database.hostname=
database.port=1433
database.user=
database.password=
database.dbname=INTCMSCINT
database.history.kafka.topic=history_kafka_poc
database.server.name=
database.history.kafka.bootstrap.servers=.servicebus.windows.net:9093
table.whitelist=Kafka_test.EMPLOYER_testCDC
#snapshot.mode=initial_schema_only
plugin.path=/opt/kafka/confluent-5.2.1/share/java
rest.port=8090
# Required Event Hubs Kafka security settings (for the worker and its
# internal producers/consumers)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********";
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********";
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********";
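Worth noting in light of the MemoryDatabaseHistory workaround above: the SASL settings here cover the worker and its producer/consumer, but Debezium's schema-history client connects to Kafka separately and has its own pass-through prefixes. A hedged sketch of what would additionally be needed (Debezium 0.x/1.x property names; the namespace and key are elided as above):

# Debezium's database-history producer/consumer need their own SASL settings
database.history.producer.security.protocol=SASL_SSL
database.history.producer.sasl.mechanism=PLAIN
database.history.producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********";
database.history.consumer.security.protocol=SASL_SSL
database.history.consumer.sasl.mechanism=PLAIN
database.history.consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********";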