Can't send to Kafka from NiFi
I'm running Docker on Windows. Here is my NiFi setup:
Details of the PublishKafka processor:
Details of the ConsumeKafka processor:
Here is my docker-compose file (note: 192.168.1.50 is my static internal host IP):
version: '3'
services:
  Jenkins:
    container_name: Jenkins
    restart: on-failure
    depends_on:
      - NiFi
    image: jenkins:latest
    ports:
      - "32779:50000"
      - "32780:8080"
  NiFi:
    container_name: NiFi
    image: xemuliam/nifi:latest
    restart: on-failure
    depends_on:
      - kafka
    ports:
      - "32784:8089"
      - "32783:8080"
      - "32782:8081"
      - "32781:8443"
    labels:
      com.foo: myLabel
  zookeeper:
    container_name: Zookeeper
    image: wurstmeister/zookeeper
    restart: on-failure
    #network_mode: host
    ports:
      - "2181:2181"
  kafka:
    #container_name: Kafka
    image: wurstmeister/kafka
    depends_on:
      - zookeeper
    #restart: on-failure
    #network_mode: host
    ports:
      - "9092"
    environment:
      #KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://192.168.1.50:9092
      #KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      KAFKA_CREATE_TOPICS: "MainIngestionTopic:1:1"
      KAFKA_ZOOKEEPER_CONNECT: 192.168.1.50:2181
      KAFKA_ADVERTISED_LISTENERS: INSIDE://:9092,OUTSIDE://192.168.1.50:9094
      KAFKA_LISTENERS: INSIDE://:9092,OUTSIDE://:9094
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
    volumes:
      - ./var/run/docker.sock:/var/run/docker.sock
When I tail the Kafka container logs, I can see that my topic was created successfully from docker-compose.
Messages are delivered successfully to the PublishKafka processor in NiFi, but the publish then fails. The ConsumeKafka processor subscribed to the same topic never receives any messages.
The NiFi container log shows the following:
2018-05-28 19:46:18,792 ERROR [Timer-Driven Process Thread-1] o.a.n.p.kafka.pubsub.PublishKafka PublishKafka[id=b2503f49-acc9-38f5-86f9-5029e2768b68] Failed to send all message for StandardFlowFileRecord[uuid=b3f6f818-34d3-42a9-9d6e-636cf17eb138,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1527533792820-1, container=default, section=1], offset=5, length=5],offset=0,name=8151630985100,size=5] to Kafka; routing to failure due to org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.
2018-05-28 19:46:18,792 INFO [Timer-Driven Process Thread-1] o.a.kafka.clients.producer.KafkaProducer Closing the Kafka producer with timeoutMillis = 5000 ms.
I also tried publishing to the topic from inside the Kafka container, but that failed as well:
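(For reference, a typical in-container test with the console producer looks something like the lines below; the container name, bin path, and flags are assumptions and may differ by Kafka version:)

# get a shell inside the Kafka container
docker exec -it <kafka-container> bash
# publish a test message against the internal listener
/opt/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MainIngestionTopic
# read it back
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic MainIngestionTopic --from-beginning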
I've combed through the documentation and read many threads trying to resolve this, but it is still a problem. Any help would be greatly appreciated!
You cannot use localhost in NiFi's "Kafka Brokers" property unless the broker is actually running on the same host as NiFi. Since each service runs in its own Docker container, the Kafka container must have a specific hostname or IP that NiFi can reach.
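For example, a minimal sketch of the kafka service advertising a resolvable name, assuming NiFi and Kafka sit on the same default compose network (the hostname kafka and the 9094 port mapping are assumptions added here, not taken from the original file):

  kafka:
    image: wurstmeister/kafka
    hostname: kafka
    depends_on:
      - zookeeper
    ports:
      - "9094:9094"   # only needed for clients running outside Docker
    environment:
      KAFKA_CREATE_TOPICS: "MainIngestionTopic:1:1"
      # container-to-container address; the published 192.168.1.50:2181 would also work
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # advertise a name other containers can resolve on INSIDE, and the host IP on OUTSIDE
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9092,OUTSIDE://192.168.1.50:9094
      KAFKA_LISTENERS: INSIDE://:9092,OUTSIDE://:9094
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE

With something like that in place, the "Kafka Brokers" property in PublishKafka and ConsumeKafka would point at kafka:9092 from inside the compose network, while clients outside Docker would use 192.168.1.50:9094.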