Logstash Error: agent - Failed to execute action
I have been following a tutorial on using the ELK stack for nginx logs.
I created nginx.conf to configure how the logs are ingested, but when I run: bin/logstash -f /etc/logstash/conf.d/nginx.conf
I get this error:
[ERROR] 2020-11-13 14:59:15.254 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [A-Za-z0-9_-], [ \t\r\n], "#", "=>" at line 9, column 8 (byte 135) after input{\n\t\n file{\n path => ["/var/log/nginx/access.log" , "/var/log/nginx/error.log"]\n type => "nginx"\n }\n filter{\n \n grok", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:365:in `block in converge_state'"]}
And here is my nginx.conf file:
input{
file{
path => ["/var/log/nginx/access.log" , "/var/log/nginx/error.log"]
type => "nginx"
}
filter{
grok{
match => ["message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
overwrite => ["message"]
}
mutate{
convert => ["response","integer"]
convert => ["bytes","integer"]
convert => ["responsetime","float"]
}
geoip{
source => "clientip"
target => "geoip"
add_tag => ["nginx-geoip"]
}
date {
match => ["timestamp" , "dd/MMM/YYYY:HH:mm:ss Z"]
remove_field => ["timestamp"]
}
useragent {
source => "agent"
}
}
output{
elasticsearch {
hosts => ["localhost:9200"]
index => "nginx-%{+yyyy.MM.dd}"
document_type => "nginx_logs"
}
}
}
I found a similar question, but the answer there did not help.
Is anyone familiar with the Logstash syntax who can help me find my mistake?
Thanks
You are missing a } to close the input section. Insert it before the filter keyword.
Also, remove the last } in the file.
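With those two changes applied (and assuming the filter and output blocks stay exactly as posted), the corrected nginx.conf would look like this:

```
input{
  file{
    path => ["/var/log/nginx/access.log" , "/var/log/nginx/error.log"]
    type => "nginx"
  }
}
filter{
  grok{
    match => ["message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}"]
    overwrite => ["message"]
  }
  mutate{
    convert => ["response","integer"]
    convert => ["bytes","integer"]
    convert => ["responsetime","float"]
  }
  geoip{
    source => "clientip"
    target => "geoip"
    add_tag => ["nginx-geoip"]
  }
  date {
    match => ["timestamp" , "dd/MMM/YYYY:HH:mm:ss Z"]
    remove_field => ["timestamp"]
  }
  useragent {
    source => "agent"
  }
}
output{
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "nginx-%{+yyyy.MM.dd}"
    document_type => "nginx_logs"
  }
}
```

You can check the syntax without starting the pipeline by running bin/logstash -f /etc/logstash/conf.d/nginx.conf --config.test_and_exit, which reports "Configuration OK" (or the parse error) and exits.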