Batching large amounts of data in Kafka?
I'm configuring my consumer with the following settings, taken almost straight from the Confluent docs. My concern is that I'm passing in a byte count (500 MB) whose value exceeds the size of an int. Possibly a silly question, but can I configure the consumer property to accept a long or something? I'm getting an error because the value is too large.

The docs don't appear to place an upper bound on it, yet the type is int, so I'm not sure how this is possible:
max.partition.fetch.bytes: The maximum amount of data per-partition the server will return. Records are fetched in batches by the consumer. If the first record batch in the first non-empty partition of the fetch is larger than this limit, the batch will still be returned to ensure that the consumer can make progress. The maximum record batch size accepted by the broker is defined via message.max.bytes (broker config) or max.message.bytes (topic config). See fetch.max.bytes for limiting the consumer request size.
Type: int
Default: 1048576
Valid Values: [0,...]
Importance: high
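For context, here is a minimal sketch of the kind of consumer configuration in question, assuming Java; the bootstrap server, group id, and topic name are placeholders, not values from the original post:

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class LargeFetchConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder connection settings for illustration only.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "large-batch-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");

        // 500 MB expressed in bytes: 500 * 1024 * 1024 = 524,288,000.
        // The documented type is int, so the value is passed as a plain
        // int (or a numeric string), not a long.
        props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 524_288_000);
        // Per the quoted docs, fetch.max.bytes caps the overall request
        // size, so it may need to be raised alongside the per-partition limit.
        props.put(ConsumerConfig.FETCH_MAX_BYTES_CONFIG, 524_288_000);

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(List.of("large-topic"));
    }
}
```

Note that 500 MB is 524,288,000 bytes, which is still below Integer.MAX_VALUE (2,147,483,647), so a value of that size does fit the documented int type.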