Logstash JSON serialization fails on valid JSON (mapper_parsing_exception)

Given the following multiline log:

{
  "code" : 429
}

and the logstash.conf pipeline below:

filter {
    grok {
        match => 
        {
            "message" => 
                [
                    "%{GREEDYDATA:json}"
                ]
        }
    }

    json {
        source => "json"
        target => "json"
    }
}

The log is shipped to Logstash via Filebeat.
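Since the log spans multiple lines, Filebeat needs a multiline configuration so the JSON object arrives as a single event. A minimal sketch of such a `filebeat.yml` excerpt follows; the paths, hosts, and pattern are hypothetical placeholders, not taken from the question:

```yaml
# Hypothetical filebeat.yml excerpt (Filebeat 6.3+ input syntax).
# Joins lines that do not start with "{" onto the preceding line,
# so a pretty-printed JSON object becomes one event.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log          # placeholder path
    multiline.pattern: '^\{'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]         # placeholder host
```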

Logstash then returns:

[2018-08-07T10:48:41,067][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-to-logstash", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2bf7b08d>], :response=>{"index"=>{"_index"=>"filebeat-to-logstash", "_type"=>"doc", "_id"=>"trAAFGUBnhQ5nUWmyzVg", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [json]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:3846"}}}}}

This behavior looks incorrect, since the JSON is perfectly valid. How can it be fixed?

I found that in Logstash 6.3.0 this problem occurs when you try to parse JSON into a field named "json". Renaming that field to anything else resolves the issue.
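Applying that workaround to the pipeline above only requires changing the `target` of the json filter. A minimal sketch, where `json_parsed` is an arbitrary name chosen for illustration:

```
filter {
    grok {
        match =>
        {
            "message" =>
                [
                    "%{GREEDYDATA:json}"
                ]
        }
    }

    json {
        source => "json"
        # Any target name other than "json" avoids the failure;
        # "json_parsed" is just an example.
        target => "json_parsed"
    }
}
```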

Since the Elastic JSON filter plugin documentation does not mention anything about this behavior, and the error message is misleading, it is reasonable to assume this is a bug.

A bug report has been filed: https://github.com/elastic/logstash/issues/9876