Need Help Parsing Nginx SSL Requests With Grok

Hello all,

I'm seeing "_grokparsefailure" and "_geoip_lookup_failure" tags when Logstash attempts to parse nginx SSL requests, for instance when parsing the following line:

xxx.xxx.xxx.xxx - - [03/Apr/2019:21:05:55 +0000]TLSv1.2/ECDHE-RSA-AES256-GCM-SHA384"POST /login_check HTTP/1.1" 302 5787"https://xxxx.xxxxx.com/" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36"

My Logstash config looks like this:

input {
  beats {
    port => 5044
  }
}

filter {
  if [fileset][module] == "nginx" {
    if [fileset][name] == "access" {
      grok {
        match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]
        overwrite => [ "message" ]
      }
      mutate {
        convert => ["response", "integer"]
        convert => ["bytes", "integer"]
        convert => ["responsetime", "float"]
      }
      if [clientip] != "127.0.0.1" and [clientip] !~ /^10\./ {
        geoip {
          source => "clientip"
          target => "geoip"
          add_tag => [ "nginx-geoip" ]
        }
      }
      date {
        match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
        remove_field => [ "timestamp" ]
      }
      useragent {
        source => "agent"
      }
    }
    if [fileset][name] == "error" {
      grok {
        match => { "message" => ["%{DATA:[nginx][error][time]} \[%{DATA:[nginx][error][level]}\] %{NUMBER:[nginx][error][pid]}#%{NUMBER:[nginx][error][tid]}: (\*%{NUMBER:[nginx][error][connection_id]} )?%{GREEDYDATA:[nginx][error][message]}"] }
        remove_field => "message"
      }
      mutate {
        rename => { "@timestamp" => "read_timestamp" }
      }
      date {
        match => [ "[nginx][error][time]", "YYYY/MM/dd H:m:s" ]
        remove_field => "[nginx][error][time]"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

I'm a n00b at this, so any insight or help will be greatly appreciated.

I was able to come up with the following grok pattern, which works fine in the Grok Debugger, but when I add it to the config the Logstash service is unable to start. Grateful for any help.

%{IPV4:clientip} %{USERNAME:ident} %{USERNAME:auth} \[%{HTTPDATE:timestamp}\]%{NOTSPACE:tlsversion}/%{NOTSPACE:cryptoalgorithm}"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{BASE10NUM:httpversion}))" %{BASE10NUM:response} (?:%{BASE10NUM:bytes}|-)"(?:%{URI:referrer}|-)" "%{GREEDYDATA:agent}"
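For what it's worth, the field layout of that pattern can be sanity-checked against the sample line with a rough Python regex equivalent (a sketch only; `\S+`/`[^"]+` here are looser than grok's `%{IPV4}`, `%{USERNAME}`, and `%{NOTSPACE}`):

```python
import re

# Rough Python equivalent of the grok pattern above (group names
# mirror the grok field names; character classes are approximations).
NGINX_SSL = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\]'                          # [03/Apr/2019:...]
    r'(?P<tlsversion>[^/]+)/(?P<cryptoalgorithm>[^"]+)'   # TLSv1.2/ECDHE-...
    r'"(?P<verb>\w+) (?P<request>\S+) HTTP/(?P<httpversion>[\d.]+)" '
    r'(?P<response>\d+) (?P<bytes>\d+|-)'                 # note: no space before "
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('xxx.xxx.xxx.xxx - - [03/Apr/2019:21:05:55 +0000]'
        'TLSv1.2/ECDHE-RSA-AES256-GCM-SHA384'
        '"POST /login_check HTTP/1.1" 302 5787'
        '"https://xxxx.xxxxx.com/" "Mozilla/5.0 (Windows NT 6.1; Win64; x64)"')

m = NGINX_SSL.match(line)
print(m.group('tlsversion'), m.group('response'), m.group('bytes'))
# → TLSv1.2 302 5787
```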

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-04-05 16:06:33.738 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-04-05 16:06:33.753 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.6.2"}
[ERROR] 2019-04-05 16:06:35.306 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 15, column 161 (byte 437) after filter {\n if [fileset][module] == "nginx" {\n if [fileset][name] == "access" {\n grok {\n match => [ "message" , "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]\n overwrite => [ "message" ]\n }\n grok {\n match => [ "message" , "%{IPV4:clientip} %{USERNAME:ident} %{USERNAME:auth} \[%{HTTPDATE:timestamp}\]%{NOTSPACE:tlsversion}/%{NOTSPACE:cryptoalgorithm}"", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:incompile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2486:inmap'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:ininitialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:ininitialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:43:in block in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:inblock in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in synchronize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:inexclusive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:39:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:327:inblock in converge_state'"]}
[INFO ] 2019-04-05 16:06:35.560 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

Either escape the double quotes in your pattern or use single quotes:

grok { match => [ "message", 'pattern contain "quoted string"' ] }
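Applied to the pattern in question, the grok stanza could look like the following (a sketch; single-quoting the pattern means the literal `"` characters in the log line no longer terminate the config string):

```
grok {
  match => [ "message", '%{IPV4:clientip} %{USERNAME:ident} %{USERNAME:auth} \[%{HTTPDATE:timestamp}\]%{NOTSPACE:tlsversion}/%{NOTSPACE:cryptoalgorithm}"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{BASE10NUM:httpversion}))" %{BASE10NUM:response} (?:%{BASE10NUM:bytes}|-)"(?:%{URI:referrer}|-)" "%{GREEDYDATA:agent}"' ]
}
```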

Thank you so much for the tip. I suspected some kind of escaping issue. Doing as you indicated solved the problem. Thanks again.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.