kafka-console-consumer.sh suspected of performing decoding

I'm using the out-of-the-box kafka-console-consumer.sh script from the Kafka installation folder. When I run it against byte input that contains a hex-encoded value, I get the raw value as the result.

I.e., input: 68656c6c6f20776f726c64, script output: hello world

I suspect this out-of-the-box script decodes hex values. Can anyone confirm or deny that?

Thanks!

It depends on how your application serializes data onto the topic.

By default, kafka-console-consumer.sh deserializes message content using a StringDeserializer.
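In other words, the hex string and the text output are two renderings of the same bytes on the topic; nothing is being "decoded" beyond interpreting the raw bytes as a UTF-8 string. A quick sanity check in Python (no Kafka involved, just the byte representations):

```python
# The hex string from the question is simply the hex rendering of
# the UTF-8 bytes for "hello world".
raw = bytes.fromhex("68656c6c6f20776f726c64")

# Interpreting those same bytes as a UTF-8 string (roughly what the
# console consumer's default string deserialization does):
print(raw.decode("utf-8"))  # hello world
```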

You can also use something like kafkacat to inspect message contents and deserialize them using various options:

 -s key=<serdes>    Deserialize non-NULL keys using <serdes>.
 -s value=<serdes>  Deserialize non-NULL values using <serdes>.
 -s <serdes>        Deserialize non-NULL keys and values using <serdes>.
                    Available deserializers (<serdes>):
                      <pack-str> - A combination of:
                                   <: little-endian,
                                   >: big-endian (recommended),
                                   b: signed 8-bit integer
                                   B: unsigned 8-bit integer
                                   h: signed 16-bit integer
                                   H: unsigned 16-bit integer
                                   i: signed 32-bit integer
                                   I: unsigned 32-bit integer
                                   q: signed 64-bit integer
                                   Q: unsigned 64-bit integer
                                   c: ASCII character
                                   s: remaining data is string
                                   $: match end-of-input (no more bytes remaining or a parse error is raised).
                                      Not including this token skips any
                                      remaining data after the pack-str is
                                      exhausted.
                      avro       - Avro-formatted with schema in Schema-Registry (requires -r)
                    E.g.: -s key=i -s value=avro - key is 32-bit integer, value is Avro.
                      or: -s avro - both key and value are Avro-serialized
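The pack-str tokens above mirror Python's struct format characters, so you can preview locally what a given `<serdes>` string would produce for a payload. A minimal sketch (the payload here is an invented example, not data from a real topic):

```python
import struct

# Example message value: a big-endian signed 32-bit integer.
payload = (42).to_bytes(4, "big")

# Equivalent of `-s value=">i$"`: big-endian (>), signed 32-bit int (i),
# with $ asserting that no bytes remain after parsing.
(value,) = struct.unpack(">i", payload)
print(value)  # 42
```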