Grok parsing errors

Hi,

I've been setting up Logstash to filter logs from our firewall and ingest them into Elasticsearch. We already have other inputs up and running successfully, although this particular attempt is more complicated than anything we've done up to this point.
The main issue is that, having followed an (albeit out-of-date) guide to monitoring our Palo Alto FW, I've had to make a number of adjustments to the config file that was provided online. The problem was that I was getting grok parse errors with the original config, so I rewrote the filter using the Grok Debugger - http://grokdebug.herokuapp.com - and there I managed to get the output I was expecting rather than no output at all.

The start of the config file aims to strip the header off each syslog line, leaving only the raw message. I ran a free syslog server on my local PC and pointed the FW logs at it so that I could see exactly what the FW was sending and figure out what was failing. The syslog messages I'm getting look like this:

Thu May 18 18:16:54 2017 192.168.0.1 <14>May 18 18:16:54 FWHOST.domain.com 1,2017/05/18 18:16:54,002201003352,TRAFFIC,end,1,2017/05/18 18:16:54,192.168.0.5,8.8.8.8,46.34.56.78.90,138.45.67.45,Firewall Policy,domain\user.surname,,ssl,vsys1,Business Unit,untrust,ae1.196,ethernet1/1,Logstash,2017/05/18 18:16:54,34191024,1,58894,443,1089,443,0x40001b,tcp,allow,1786,1235,551,13,2017/05/18 18:16:27,25,computer-and-internet-info,0,2577674231,0x0,10.0.0.0-10.255.255.255,GB,0,6,7,tcp-rst-from-server,15,43,0,0,,FW-HOST,from-policy

I'm looking to strip out the header (Thu May 18 18:16:54 2017 192.168.0.1 <14>May 18 18:16:54 FWHOST.domain.com 1), and to do this successfully in the Grok Debugger I've used the following pattern:

%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR} %{IPV4} %{DATA} %{MONTHDAY} %{TIME} %{HOSTNAME} %{BASE10NUM},%{GREEDYDATA}
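
For reference, this is roughly how that pattern sits in my config (a simplified sketch of just the relevant grok block; the "raw_message" capture name is only illustrative, and "paloalto" is the type I've assigned to this input):

filter {
  if [type] == "paloalto" {
    grok {
      # Match and discard the syslog header; %{GREEDYDATA} keeps the remaining PAN-OS CSV fields
      match => { "message" => "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR} %{IPV4} %{DATA} %{MONTHDAY} %{TIME} %{HOSTNAME} %{BASE10NUM},%{GREEDYDATA:raw_message}" }
    }
  }
}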

The actual event with the grok parse failure that I get is:

{"@timestamp":"2017-05-21T19:13:38.127Z","@version":"1","host":"10.44.255.33","type":"paloalto","tags":["_grokparsefailure"]}

From this, I'm assuming my issue is that, since I need a primary field to search on, I HAVE to define a field called "@timestamp". Can someone confirm whether this is the case, and if it is, where is the grok parse failure picking up the "@version" from?

Despite the Grok Debugger having no issues with my grok statement, when I drop it into Logstash I'm still getting parse errors. I'm not sure where to look now: as far as I understand it, the only part of the config that is being grok'ed is the part that removes the syslog headers, so if the pattern works in the Grok Debugger, I'm at a loss as to why it won't work in Logstash.
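
In case it helps narrow things down, this is the kind of minimal standalone pipeline I can use to test just the grok part (a sketch for debugging only; the stdin/stdout stanzas stand in for my real syslog input and Elasticsearch output):

input {
  # Paste the raw syslog line into stdin instead of using the real network input
  stdin { type => "paloalto" }
}
filter {
  grok {
    match => { "message" => "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR} %{IPV4} %{DATA} %{MONTHDAY} %{TIME} %{HOSTNAME} %{BASE10NUM},%{GREEDYDATA}" }
  }
}
output {
  # rubydebug prints every field on the event, including any _grokparsefailure tag
  stdout { codec => rubydebug }
}

Running this with the raw line pasted in should show exactly what the message field contains at the point grok sees it, which is what I want to compare against what the Grok Debugger sees.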

Any help you can provide would be appreciated! Apologies for the excessively long post....
