Clickhouse Kafka Engine Throwing Exception

I am trying to ingest data using the ClickHouse Kafka engine. The data is in CSV format. During ingestion I sometimes hit the following exception:

2018.01.08 08:41:47.016826 [ 3499 ] <Debug> StorageKafka (consumer_queue): Started streaming to 1 attached views
2018.01.08 08:41:47.016906 [ 3499 ] <Trace> StorageKafka (consumer_queue): Creating formatted reader
2018.01.08 08:41:49.680816 [ 3499 ] <Error> void DB::StorageKafka::streamThread(): Code: 117, e.displayText() = DB::Exception: Expected end of line, e.what() = DB::Exception, Stack trace:

0. clickhouse-server(StackTrace::StackTrace()+0x16) [0x3221296]
1. clickhouse-server(DB::Exception::Exception(std::string const&, int)+0x1f) [0x144a02f]
2. clickhouse-server() [0x36e6ce1]
3. clickhouse-server(DB::CSVRowInputStream::read(DB::Block&)+0x1a0) [0x36e6f60]
4. clickhouse-server(DB::BlockInputStreamFromRowInputStream::readImpl()+0x64) [0x36e3454]
5. clickhouse-server(DB::IProfilingBlockInputStream::read()+0x16e) [0x2bcae0e]
6. clickhouse-server(DB::KafkaBlockInputStream::readImpl()+0x6c) [0x32f6e7c]
7. clickhouse-server(DB::IProfilingBlockInputStream::read()+0x16e) [0x2bcae0e]
8. clickhouse-server(DB::copyData(DB::IBlockInputStream&, DB::IBlockOutputStream&, std::atomic<bool>*)+0x55) [0x35b3e25]
9. clickhouse-server(DB::StorageKafka::streamToViews()+0x366) [0x32f54f6]
10. clickhouse-server(DB::StorageKafka::streamThread()+0x143) [0x32f58c3]
11. clickhouse-server() [0x40983df]
12. /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba) [0x7f4d115d06ba]
13. /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f4d10bf13dd]

Below are the table definitions:

CREATE TABLE test.consumer_queue (ID Int32,  DAY Date) ENGINE = Kafka('broker-ip:port', 'clickhouse-kyt-test','clickhouse-kyt-test-group', 'CSV')

CREATE TABLE test.consumer_request ( ID Int32,  DAY Date) ENGINE = MergeTree PARTITION BY DAY ORDER BY (DAY, ID) SETTINGS index_granularity = 8192

CREATE MATERIALIZED VIEW test.consumer_view TO test.consumer_request (ID Int32, DAY Date) AS SELECT ID, DAY FROM test.consumer_queue

CSV data:

10034,"2018-01-05"
10035,"2018-01-05"
10036,"2018-01-05"
10037,"2018-01-05"
10038,"2018-01-05"
10039,"2018-01-05"

ClickHouse server version 1.1.54318.

ClickHouse appears to read a batch of messages from Kafka and then try to decode the whole batch as a single CSV document. Within that single CSV, the messages must be separated by line breaks, so every message should end with a newline character.
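The effect can be illustrated with a short sketch (plain Python with hypothetical payloads, no Kafka involved): concatenating messages that lack trailing newlines fuses the batch into one malformed CSV row, while appending `\n` to each message keeps the rows separate.

```python
import csv
import io

# Hypothetical Kafka message payloads: one CSV row each, no trailing newline
messages = ['10034,"2018-01-05"', '10035,"2018-01-05"', '10036,"2018-01-05"']

# Without delimiters the whole batch collapses into a single line
merged_bad = "".join(messages)
rows_bad = list(csv.reader(io.StringIO(merged_bad)))

# With a trailing newline per message, each row parses cleanly
merged_good = "".join(m + "\n" for m in messages)
rows_good = list(csv.reader(io.StringIO(merged_good)))

print(len(rows_bad))   # 1 -- everything fused into one row
print(len(rows_good))  # 3 -- one row per message
```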

I am not sure whether this is a ClickHouse feature or a bug.

You could try sending just a single message to Kafka and checking whether it shows up correctly in ClickHouse.

If you send messages to Kafka with the kafka-console-producer.sh script, note that this script (class ConsoleProducer.scala) reads lines from a file and sends each line to the Kafka topic without a trailing newline, so such messages cannot be processed correctly.

If you send messages with your own script/application, you can modify it to append a newline character to the end of each message; that should fix the problem. Alternatively, you can use another format for the Kafka engine, for example JSONEachRow.
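As a sketch of the JSONEachRow alternative (plain Python; the field names match the `test.consumer_queue` schema above, and the newline handling is still an assumption to verify): each message becomes a self-contained, newline-terminated JSON object.

```python
import json

# Rows matching the test.consumer_queue schema (ID Int32, DAY Date)
rows = [
    {"ID": 10034, "DAY": "2018-01-05"},
    {"ID": 10035, "DAY": "2018-01-05"},
]

# One JSON object per message, newline-terminated, as JSONEachRow expects
payloads = [json.dumps(r) + "\n" for r in rows]

for p in payloads:
    print(p, end="")
```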

I agree with @mikhail's answer; I'd also suggest trying kafka_row_delimiter = '\n' in the SETTINGS of the Kafka engine.
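A sketch of what that DDL might look like. Note this uses the SETTINGS-style Kafka engine syntax from newer ClickHouse releases; whether kafka_row_delimiter and this clause are available on 1.1.54318 (which used positional engine arguments) is an assumption to verify against your server version.

```sql
CREATE TABLE test.consumer_queue (ID Int32, DAY Date)
ENGINE = Kafka
SETTINGS kafka_broker_list = 'broker-ip:port',
         kafka_topic_list = 'clickhouse-kyt-test',
         kafka_group_name = 'clickhouse-kyt-test-group',
         kafka_format = 'CSV',
         kafka_row_delimiter = '\n'
```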