Parsing my json file by using grok pattern in logstash?

I am trying to parse a JSON file into Elasticsearch by using Logstash, but I can't; I think I need to write some grok pattern for it, and I haven't been able to. How can I send the following JSON to Elasticsearch with Logstash?

{"machinename":"test1",

"longdate":"2019-01-29 13:19:32",

"level":"Error",

"mysite":"test1",

"message":"test2",

"exception":"test3",

"timestamp":"2019-01-29T13:19:32.257Z" }

My Logstash config file:


input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    discover_interval => 10
    stat_interval => 10
    sincedb_write_interval => 10
    close_older => 10
    codec => multiline {
      negate => true
      what => "previous"
    }
  }
}

filter {
  date {
    match => ["TimeStamp", "ISO8601"]
  }
  json {
    source => "request"
    target => "parsedJson"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log-%{+YYYY.MM}"
  }
}



The errors:

[2019-01-29T14:30:54,907][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:30:56,929][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:30:59,167][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 12, column 18 (byte 281) after input {\n file {\n\t path => [\"P:/logs/*.txt\"]\n\t\tstart_position => \"beginning\" \n\t\tdiscover_interval => 10\n\t\tstat_interval = > 10\n\t\tsincedb_write_interval => 10\n\t\tclose_older => 10\n codec => multiline { \n\t\tpattern => \"^%{TIMESTAMP_ISO8601}\\"\n\t\tnegate => true\n what => \"", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:31:00,417][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-29T14:34:23,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:34:24,554][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:34:27,486][ERROR][logstash.codecs.multiline] Missing a required setting for the multiline codec plugin:

  codec {
    multiline {
      pattern => # SETTING MISSING
      ...
    }
  }
[2019-01-29T14:34:27,502][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "P:/elk/logstash/logstash-core/lib/logstash/codecs/base.rb:19:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:34:27,971][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}

You can try to use the json filter plugin for Logstash.

That way, the filter plugin in Logstash will parse the JSON:

filter {
  json {
    source => "message"
  }
}
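
Since no target is set here, the parsed keys (machinename, level, mysite, and so on) are written to the root of the event, so they arrive in Elasticsearch as ordinary top-level fields and can be referenced directly by later filters such as date.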

Another nice thing is tag_on_failure. That way, if the JSON is invalid or misinterpreted, you will still see the message in Elasticsearch/Kibana, but tagged with _jsonparsefailure.

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}
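
Putting it together, here is a minimal sketch of the whole pipeline, assuming each line of the *.txt files holds one complete JSON document (so the multiline codec, and with it the pattern setting the error above complains about, can be dropped). Note that the date filter has to reference the lowercase timestamp field that actually appears in the sample event; the TimeStamp spelling in the original config never matches, because Logstash field names are case-sensitive. The sincedb_path setting is an optional testing convenience:

input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    sincedb_path => "NUL"   # optional, Windows: forget read positions so test files are re-read
  }
}

filter {
  # parse the raw line; with no target, the parsed keys land at the event root
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
  # use the parsed "timestamp" field as the event's @timestamp
  date {
    match => ["timestamp", "ISO8601"]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "log-%{+YYYY.MM}"
  }
}

If your JSON documents really do span several lines, keep the multiline codec instead, but give it the pattern setting that the error message asks for, for example one anchored on the opening brace of each document.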