Logstash grok filter with syslog

Hi,

I'm trying to get a filter working for this log file with Logstash:
2016-10-30T13:23:47+01:00 router.lan pppd[12566]: local IP address 1.2.3.4

The Grok debugger can resolve the fields nicely with this expression:
%{DATE:datum}T%{TIME:time}%{ISO8601_TIMEZONE:timezone} %{HOSTNAME:hostname} %{WORD:service}%{GREEDYDATA:id_fields} %{IP:wanip}

What I would like to get working with your help (after trying unsuccessfully for a day) is transferring the fields recognized by grok into Elasticsearch via a Logstash config, so I can filter in Kibana for e.g. wanip.

Hope you can help :slight_smile:

The Logstash config looks like this at the moment:

input {
  file {
    path => "/var/log/rsyslog/router1.log"
    start_position => "beginning"
    type => "routerlog"
  }
}

filter {
  if [type] == "routerlog" {
    grok {
      match => { "message" => "%{DATE:datum}" }   # <<<< would like to add more custom fields - here? >>>>
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Does this not work? What happens? What version are you on?

It works if I remove the filter section
or have just one match in the filter.
But then it only recognizes the timestamp and some fields I don't need.
I would like it to recognize the IP and the FQDN.
How do I send the file nicely parsed (like grok does it easily :slight_smile: ) to Elasticsearch?

I'm on version 5.0.0 for all components.

Instead of %{DATE:datum}T%{TIME:time}%{ISO8601_TIMEZONE:timezone} use %{TIMESTAMP_ISO8601:timestamp}. For the sake of the date filter you'll want to have the full timestamp in a single field anyway. The DATE pattern doesn't match yyyy-mm-dd dates.
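
Just as a sketch of how that could fit together (the pppd-specific part of the pattern is only a guess based on your sample line, so adjust as needed):

filter {
  if [type] == "routerlog" {
    grok {
      # extract the full ISO8601 timestamp plus host, service, pid and the WAN IP
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:service}\[%{POSINT:pid}\]: local IP address %{IP:wanip}" }
    }
    # copy the parsed timestamp into @timestamp
    date {
      match => ["timestamp", "ISO8601"]
    }
  }
}

Note that this particular pattern only matches the "local IP address" lines; other pppd lines would get a _grokparsefailure tag.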

Thanks for the tip :slight_smile: Yes, I probably just need one field for the date.
My main problem still is: how do I get everything nicely filtered (with the grok expressions I have) into Elasticsearch and Kibana?
I just want nicely searchable data in Kibana - or do I apply the grok filter somewhere in Kibana itself?
Sorry, noob questions :slight_smile: I hope I can start digging through bigger logs soon...

Start with this --
%{TIMESTAMP_ISO8601:timestamp} %{DATA:router} %{DATA:proc}: %{GREEDYDATA:msg}
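
Against your sample line, that should give you roughly these fields:

timestamp: 2016-10-30T13:23:47+01:00
router:    router.lan
proc:      pppd[12566]
msg:       local IP address 1.2.3.4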

If you wish to play with grok a bit more and refine your filter, you can use this --

http://grokconstructor.appspot.com/do/match#result

My main problem still is: how do I get everything nicely filtered (with the grok expressions I have) into Elasticsearch and Kibana?

If you extract the fields with the grok filter like you're doing in your first example (where you have %{HOSTNAME:hostname} etc) they will end up in Elasticsearch and will therefore be available in Kibana.
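
For example, once the wanip field from your first pattern is indexed, you can search for it directly in Kibana's query bar:

wanip:1.2.3.4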

%{TIMESTAMP_ISO8601:timestamp} %{DATA:router} %{DATA:proc}\: %{GREEDYDATA:msg}

Using more than one DATA or GREEDYDATA in the same expression easily leads to weird matches. I suggest using the standard patterns for syslog messages instead:

%{TIMESTAMP_ISO8601:timestamp} %{SYSLOGHOST:host} %{SYSLOGPROG}: %{GREEDYDATA:msg}
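
As a sketch, the full filter section with that pattern could look like this (SYSLOGPROG itself expands into program and pid fields):

filter {
  if [type] == "routerlog" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{SYSLOGHOST:host} %{SYSLOGPROG}: %{GREEDYDATA:msg}" }
    }
    # parse the extracted timestamp into @timestamp
    date {
      match => ["timestamp", "ISO8601"]
    }
  }
}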

Thanks all! I went with the last one from Magnus. It has no field for the IP but I'll get there :slight_smile:
It's a nice tool with a lot of potential.