Hi,
I ran the Logstash config test and it reports that the configuration is OK, but when the pipeline actually starts, the Logstash log shows "Pipeline aborted due to error".
Config test log:
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/ffmpeglogs.conf --config.test_and_exit
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-12-26 12:11:13.389 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
Configuration OK
[INFO ] 2019-12-26 12:11:16.182 [LogStash::Runner] runner - Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
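(As a side note, I believe the logstash.yml warning above only appears because I did not pass --path.settings; rerunning the test with the settings directory specified, as the warning suggests, should silence it, assuming the standard /etc/logstash layout:)

/usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/ffmpeglogs.conf --config.test_and_exit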
Logstash error log:
[2019-12-26T12:26:58,649][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{NGINX_ACCESS} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:340:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:351:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:351:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:729:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:361:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x25500908 run>"}
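The backtrace shows the failure happening when the grok filter registers: it tries to compile %{NGINX_ACCESS} and cannot find a definition, which is why the syntax-only config test still passes. As I understand it, the first thing to verify is that the pattern name is actually defined in a file under my patterns_dir, e.g. with something like:

grep -R "NGINX_ACCESS" /etc/logstash/patterns/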
My Logstash config:
input {
  kafka {
    bootstrap_servers => ["localhost:9092"]
    topics => ["ffmpeglog"]
    codec => "json"
    add_field => { "logz" => "ffmpeglogz" }
  }
}

filter {
  if [logz] == "ffmpeglogz" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => { "message" => "%{NGINX_ACCESS}" }
      remove_tag => ["_grokparsefailure"]
      add_tag => ["ffmpeg_access"]
      remove_field => [ "message" ]
    }
    # keep only HTTP 101 (Switching Protocols) responses; drop everything else
    if [response] == "101" {
      date {
        match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
        target => "@timestamp"
      }
      geoip {
        source => "clientip"
        target => "geoip"
        database => "/usr/share/GeoIP/GeoLite2-City.mmdb"
      }
      geoip {
        source => "clientip"
        target => "geoip"
        database => "/usr/share/GeoIP/GeoLite2-ASN.mmdb"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
      }
      mutate {
        convert => [ "[geoip][coordinates]", "float" ]
        remove_tag => [ "_geoip_lookup_failure" ]
        remove_field => [ "@version", "auth", "offset", "verb" ]
      }
    } else {
      drop {}
    }
  }
}

output {
  if [logz] == "ffmpeglogz" {
    elasticsearch {
      hosts => ["51.78.182.102:9200"]
      index => "ffmpeglogs-%{+YYYY.MM.dd}"
    }
    file {
      path => "/var/log/logstash/ffmpeglogs.log"
    }
  }
}
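For reference, since the grok filter points at patterns_dir => "/etc/logstash/patterns", my understanding is that some file in that directory (grok loads every file there by default) must define NGINX_ACCESS, roughly along the lines below. This is only a sketch based on the fields my filter uses (clientip, timestamp, verb, response, auth); the exact definition is an assumption on my part:

NGINX_ACCESS %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response} %{NUMBER:bytes} %{QS:referrer} %{QS:agent}

The file also needs to be readable by the user Logstash runs as.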
Could you please help me figure out why the pattern is not being found?