Logstash: send only the JSON
I am sending this log line through Logstash:
2017-02-27T13:00:07+01:00 test {"createdAt":"2017-02-27T13:00:07+0100","cluster":"undefined","nodeName":"undefined","nodeIP":"10.11.11.50","clientIP":"10.11.11.72","customerId":1,"identityId":332,"appType":"admin","eventGroup":"education","eventName":"insert","eventData":{"education_insert":{"type":"course","data":{"education_id":2055,"education":{"id":2055,"customer_id":1,"creator_id":332,"type":"course","status":"new","is_featured":false,"enroll_deadline":null,"complete_deadline":null,"count_view":0,"count_like":0,"meta_title":"test Course - progress","meta_description":"test Course - progress","discoverable":"everyone","progress_max":0,"instructor_ids":[332],"tag_ids":[135],"discoverable_group_ids":[],"category_ids":[14],"audits":null,"instructors":null,"creator":null,"lessonGroups":null,"categories":null},"duration":"quick"}}},"scopeType":"education","scopeId":"2055"}
How can I remove 2017-02-27T13:00:07+01:00 and test.app.event so that only the JSON is sent?
You need to use grok to extract the JSON part of the message, then use the json filter to turn that extracted JSON into event fields. Finally, you can use mutate to remove any fields you no longer need in the final event (for example, message).
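In outline, the filter chain would look something like this (a rough sketch only; the capture field json_part and the exact match pattern are assumptions based on the sample line, not taken from a working config):

filter {
  grok {
    # assumption: a timestamp, a topic token, then the JSON, separated by whitespace
    match => { "message" => "%{TIMESTAMP_ISO8601:log_ts}%{SPACE}%{NOTSPACE:log_topic}%{SPACE}%{GREEDYDATA:json_part}" }
  }
  json {
    # turn the captured JSON string into top-level event fields
    source => "json_part"
  }
  mutate {
    # drop the raw fields that are no longer needed
    remove_field => ["message", "log_ts", "log_topic", "json_part"]
  }
}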
You can use a regex pattern to capture only the JSON. The regex pattern should live in your patterns file and could look like this:
REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)} <-- this extracts only your json part
The regex pattern has been tested.
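For reference, a grok patterns file is just a plain-text file inside the directory that patterns_dir points to, with one name-plus-regex definition per line. Assuming a file such as /pathto/patterns/extra (the file name is arbitrary), it would contain:

# /pathto/patterns/extra -- any file inside patterns_dir is loaded
REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)}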
After that you can use the pattern in your grok match:
grok {
  patterns_dir => ["/pathto/patterns"]
  # capture the JSON portion of the line into a field named "new"
  match => { "message" => "%{REQUIREDDATA:new}" }
}
Now you have only the JSON part of the log line, so you can push it to ES through Logstash. The above is just an example to make it easy to reproduce.
Hope this helps!
I used this and it worked for me :) Thanks for the help.
input {
  kafka { bootstrap_servers => "localhost:9092" topics => ["test"] }
}
filter {
  grok {
    # split the line into the timestamp, the topic token, and the trailing JSON
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}" }
  }
  json {
    # parse the extracted JSON string, then drop the helper fields
    source => "json"
    remove_field => ["timestamp","topic","json","message","@version","@timestamp","tags"]
  }
}
output {
  elasticsearch { hosts => ["127.0.0.1:9200"] document_type => "app_stats" index => "test" }
}
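One way to verify the result before (or alongside) indexing is to add a stdout output with the rubydebug codec, which prints each final event to the console. A small sketch of that variation of the output section:

output {
  # print the final event so you can confirm only the JSON fields remain
  stdout { codec => rubydebug }
  elasticsearch { hosts => ["127.0.0.1:9200"] document_type => "app_stats" index => "test" }
}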