How to configure Kafka Connect to use an Avro Schema?
I've started learning Avro and I want to use it in Kafka Connect. I'm using the configuration below. Is this the correct configuration?
{
  "name": "surveyWawancara-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "key.deserializer": "org.apache.kafka.connect.json.JsonDeserializer",
    "database.user": "Acquisition.ro",
    "database.dbname": "acquisition",
    "value.deserializer": "org.apache.kafka.connect.json.JsonDeserializer",
    "tasks.max": "1",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "database.history.kafka.bootstrap.servers": "beta-kafka-brokers.amq-streams-beta.svc:9092",
    "database.history.kafka.topic": "schema-changes.sl.surveyWawancara",
    "time.precision.mode": "connect",
    "database.server.name": "beta-sl-bn",
    "database.port": "1433",
    "table.whitelist": "dbo.SurveyWawancara",
    "key.converter.schemas.enable": "true",
    "database.hostname": "10.7.76.62",
    "database.password": "Acquisition_ro231!",
    "value.converter.schemas.enable": "true",
    "name": "surveyWawancara-connector",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter"
  }
}
You have duplicated the converter fields, but yes, these properties are correct:
"key.converter": "io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url": "http://localhost:8081",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081",
Avro always has a schema, so the *.schemas.enable properties do nothing and can be removed. Likewise, the deserializer class configs do not apply to Connect (the converter configs cover that role), so they should be removed as well.
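
Applying those suggestions, a cleaned-up version of the config might look roughly like this (a sketch that keeps the original connection values, drops the duplicated name/JsonConverter entries, the *.schemas.enable flags, and the deserializer properties):

{
  "name": "surveyWawancara-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "database.hostname": "10.7.76.62",
    "database.port": "1433",
    "database.user": "Acquisition.ro",
    "database.password": "Acquisition_ro231!",
    "database.dbname": "acquisition",
    "database.server.name": "beta-sl-bn",
    "table.whitelist": "dbo.SurveyWawancara",
    "time.precision.mode": "connect",
    "database.history.kafka.bootstrap.servers": "beta-kafka-brokers.amq-streams-beta.svc:9092",
    "database.history.kafka.topic": "schema-changes.sl.surveyWawancara",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}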
Worth mentioning: the key format does not have to (and frequently doesn't) match the value's format.
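
For example, if the keys don't need a registered Avro schema, only the value side has to use the AvroConverter; a combination like this (purely illustrative) is also valid:

"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081"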