Hello everyone, I want to ask a question about CDC from MySQL to Kafka with Azure Database for MySQL. I have been following this tutorial:
https://techcommunity.microsoft.com/t5/azure-database-for-mysql-blog/cdc-in-azure-database-for-mysql-flexible-server-using-kafka/ba-p/2780943
but I got stuck while creating the Kafka connector at this step, with the following error:
{"error_code":400,"message":"Connector configuration is invalid and contains the following 1 error(s):nUnable to connect: Communications link failurennThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.nYou can also find the above list of errors at the endpoint `/connector-plugins/{connectorType}/config/validate`"}
Kafka connector config:
{
  "name": "sql-server-connection",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "localhost",
    "database.port": "3306",
    "database.user": "soleluna",
    "database.dbname": "cdcdatabase",
    "database.password": "mypassword",
    "database.server.id": "1",
    "database.server.name": "userserver",
    "table.whitelist": "dbo.users",
    "database.history": "io.debezium.relational.history.MemoryDatabaseHistory",
    "topic.prefix": "cdc.kafkadev"
  }
}
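This is how I submit the connector (assuming the Connect worker's REST API is on the default localhost:8083 and the full JSON above is saved as mysql-connector.json):

# create the connector by posting the JSON config to the Connect REST API
curl -s -X POST -H "Content-Type: application/json" \
  --data @mysql-connector.json \
  http://localhost:8083/connectors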
connect-distributed.properties config:
bootstrap.servers=goldwing.servicebus.windows.net:9093
group.id=connect-cluster-group-1
# connect internal topic names, auto-created if not exists
config.storage.topic=connect-cluster-configs
offset.storage.topic=connect-cluster-offsets
status.storage.topic=connect-cluster-status
# internal topic replication factors - auto 3x replication in Azure Storage
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1
rest.advertised.host.name=connect
offset.flush.interval.ms=10000
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
# required EH Kafka security settings
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://goldwing.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=********";
plugin.path=/opt/kafka/libs
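For completeness, I start the Connect worker in distributed mode with the standard script (assuming Kafka is installed under /opt/kafka, as the plugin.path above suggests):

# start the Kafka Connect worker with the properties file above
/opt/kafka/bin/connect-distributed.sh /opt/kafka/config/connect-distributed.properties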
The connect-cluster-configs, connect-cluster-offsets, and connect-cluster-status topics are already created automatically in the Azure namespace. Can anyone show me where my mistake is? Any answer will be appreciated, thank you.
My expectation is that the Debezium Kafka connector topics are created and that data is replicated from Azure MySQL to the Kafka topics (Event Hubs).
2 Answers
After some exploration and trial and error, I have solved the issue. The problem was the TLS setting on the Azure Event Hubs namespace: I had been using TLS version 1.2 (the latest), and after changing it to TLS version 1.0 everything works fine, and the default topics (Event Hubs) are automatically created in the namespace.
https://i.sstatic.net/Z4wihkLm.png
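If you prefer the CLI to the portal, the minimum TLS version on the namespace can be changed with something like this (the resource group name is a placeholder; substitute your own):

# lower the minimum TLS version on the Event Hubs namespace "goldwing"
az eventhubs namespace update \
  --name goldwing \
  --resource-group <your-resource-group> \
  --minimum-tls-version 1.0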
@Dityudha, without more logs this may be difficult to solve. Please check the following and post more logs:
Check whether CDC is enabled for the tables (for the Debezium MySQL connector this means the binary log is enabled with ROW format); see the query sketch after this list.
Check the Kafka Connect logs.
Do you have any other hints in the logs?
Are there issues with the permissions used for the MySQL server? Are there any logs there?
Are there any network issues between where the connector is deployed and the MySQL server?
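A minimal sketch for the first check, assuming you can reach the Azure MySQL server with the mysql client (the hostname is a placeholder; the user is taken from the connector config above):

# Debezium's MySQL connector needs the binary log enabled with ROW format
mysql -h <your-server>.mysql.database.azure.com -u soleluna -p \
  -e "SHOW VARIABLES WHERE Variable_name IN ('log_bin','binlog_format','binlog_row_image');"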