Grok Parse Failure - Logger

Ok, so the log I am trying to parse is the output of a command-line speed test. I am using the logger command to ship the results to my ELK server. I've done the legwork and built a pattern that works on both https://grokdebug.herokuapp.com/ and https://grokconstructor.appspot.com.

However, when the log actually hits my server, I get a Grok Parse failure. The message is listed below and the filter is below that.

<13>1 2017-12-07T07:04:02.720491-06:00 internal-web user - - [timeQuality tzKnown="1" isSynced="1" syncAccuracy="136000"] 2017-12-07 07:04:02	2017-12-07 07:04:30	Time Warner Cable	10.10.10.10	Grande Communications (San Marcos, TX)	94.31	26.775	336.03	23.38	http://www.speedtest.net/result/6858080753.png

%{SYSLOG5424LINE}%{SPACE}%{TIMESTAMP_ISO8601:Start}%{SPACE}%{TIMESTAMP_ISO8601:Stop}%{SPACE}%{GREEDYDATA:Provider}%{SPACE}%{IP:Local_IP}%{SPACE}%{GREEDYDATA:Server}%{BASE16FLOAT:Distance}   %{BASE16FLOAT:Ping}  %{BASE16FLOAT:Download}  %{BASE16FLOAT:Upload}   %{URI}

As far as I can tell, the issue is the %{SYSLOG5424LINE} portion of the filter. What confuses me is that this should be a stock Logstash pattern, unless something changed in a recent update. I am not on the absolute latest version of the services, but I am on version 5.6.1, so I am not too far behind the most recent releases.

Am I missing something dramatic?

If you look at the definition of SYSLOG5424LINE, you'll see that it includes the timestamp, hostname, and so on, so you shouldn't include them in your expression.
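For reference, in the patterns file that ships with the grok filter, SYSLOG5424LINE is defined (roughly; check the copy installed with your version) as the RFC 5424 header plus a GREEDYDATA capture of everything after it:

```
SYSLOG5424LINE %{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}
```

So a match on %{SYSLOG5424LINE} alone already consumes the whole line and leaves the part after the header in the syslog5424_msg field.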

I should point out that the message I get really comes in two parts.

<13>1 2017-12-07T07:04:02.720491-06:00 internal-web user - - [timeQuality tzKnown="1" isSynced="1" syncAccuracy="136000"] 

This portion is added by the logger command and is not part of my actual speed test output. If I could ship it to my ELK stack without this portion, I would prefer that.

2017-12-07 07:04:02	2017-12-07 07:04:30	Time Warner Cable	10.10.10.10	Grande Communications (San Marcos, TX)	94.31	26.775	336.03	23.38	http://www.speedtest.net/result/6858080753.png

This is the output from my actual command and the only part I care about, but I can't parse it without dealing with the first half.

To add some more context, this is the main portion of my Logstash config. What I haven't included are my input, pfSense, and output config files, but those are all working as expected, so I figure they are not required.

filter {
  if [type] == "syslog" {
    # pfSense IP address
    if [host] =~ /10.10.10.10/ {
      mutate {
        add_tag => ["PFSense", "Ready"]
      }
    }
    if [host] =~ /10.10.10.11/ {
      mutate {
        add_tag => ["SpeedTest", "Ready"]
      }
    }
    if "Ready" not in [tags] {
      mutate {
        add_tag => [ "syslog" ]
      }
    }
  }
}
filter {
  if [type] == "syslog" {
    mutate {
      remove_tag => "Ready"
    }
  }
}
filter {
  if "syslog" in [tags] {
    grok {
      patterns_dir => "/etc/logstash/conf.d/patterns"
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
      locale => "en"
    }
    if !("_grokparsefailure" in [tags]) {
      mutate {
        replace => [ "@source_host", "%{syslog_hostname}" ]
        replace => [ "@message", "%{syslog_message}" ]
      }
    }
    mutate {
      remove_field => [ "syslog_hostname", "syslog_message", "syslog_timestamp" ]
    }
  }
}
filter {
  if "SpeedTest" in [tags] {
    grok {
      patterns_dir => "/etc/logstash/conf.d/patterns"
      match => { "message" => "%{SYSLOG5424LINE}%{SPACE}%{TIMESTAMP_ISO8601:Start}%{SPACE}%{TIMESTAMP_ISO8601:Stop}%{SPACE}%{GREEDYDATA:Provider}%{SPACE}%{IP:Local_IP}%{SPACE}%{GREEDYDATA:Server}%{BASE16FLOAT:Distance} %{BASE16FLOAT:Ping} %{BASE16FLOAT:Download} %{BASE16FLOAT:Upload} %{URI}"}
    }
  }
}
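Given that SYSLOG5424LINE already captures everything after the header into syslog5424_msg, one way to rework the SpeedTest filter is to match the header on its own and then run a second grok against just the payload. This is a sketch, not tested against your exact events: Result_URL is simply a name I picked, and I've used %{SPACE} between the numeric fields instead of literal spaces, since the separators in the pasted message look like tabs, which a literal space in a grok pattern will not match.

```
filter {
  if "SpeedTest" in [tags] {
    # First pass: strip the RFC 5424 header added by logger.
    # The rest of the line lands in syslog5424_msg.
    grok {
      match => { "message" => "%{SYSLOG5424LINE}" }
    }
    # Second pass: parse only the speed-test payload.
    grok {
      match => { "syslog5424_msg" => "%{TIMESTAMP_ISO8601:Start}%{SPACE}%{TIMESTAMP_ISO8601:Stop}%{SPACE}%{DATA:Provider}%{SPACE}%{IP:Local_IP}%{SPACE}%{DATA:Server}%{SPACE}%{BASE16FLOAT:Distance}%{SPACE}%{BASE16FLOAT:Ping}%{SPACE}%{BASE16FLOAT:Download}%{SPACE}%{BASE16FLOAT:Upload}%{SPACE}%{URI:Result_URL}" }
    }
  }
}
```

If the second grok still fails, a stdout { codec => rubydebug } output makes it easy to see exactly what syslog5424_msg contains; tab and space separators look identical in Kibana but match differently in grok.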

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.