I'm trying to get a Logstash filter working for this log line: 2016-10-30T13:23:47+01:00 router.lan pppd[12566]: local IP address 1.2.3.4
The Grok debugger resolves the fields nicely with this expression: %{DATE:datum}T%{TIME:time}%{ISO8601_TIMEZONE:timezone} %{HOSTNAME:hostname} %{WORD:service}%{GREEDYDATA:id_fields} %{IP:wanip}
What I would like to get working, with your help (after trying unsuccessfully for a day), is transferring the fields recognized by grok into Elasticsearch via a Logstash config, so that I can filter in Kibana on e.g. wanip.
Hope you can help.
My Logstash config currently looks like this:
input {
  file {
    path => "/var/log/rsyslog/router1.log"
    start_position => "beginning"
    type => "routerlog"
  }
}
filter {
  if [type] == "routerlog" {
    grok {
      match => { "message" => "%{DATE:datum}" } # <- would like to add more custom fields - here?
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
It works if I remove the filter section or keep just one match in the filter.
But then it only recognizes the timestamp and some fields I don't need.
I would like it to recognize the IP and the FQDN.
How do I send the file nicely parsed (like grok does it so easily) to Elasticsearch?
Instead of %{DATE:datum}T%{TIME:time}%{ISO8601_TIMEZONE:timezone} use %{TIMESTAMP_ISO8601:timestamp}. For the sake of the date filter you'll want to have the full timestamp in a single field anyway. The DATE pattern doesn't match yyyy-mm-dd dates.
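A sketch of what that could look like (the "ISO8601" shorthand in the date filter is the usual way to handle timestamps like yours, but verify it against your actual log format):

```
filter {
  grok {
    # Capture the whole timestamp in one field instead of three
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:rest}" }
  }
  date {
    # Parse the captured field into @timestamp
    match => ["timestamp", "ISO8601"]
  }
}
```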
Thanks for the tip! Yes, I probably just need one field for the date.
My main problem still is: how do I get everything nicely filtered (with the grok expressions I have) into Elasticsearch and Kibana?
I just want nicely searchable data in Kibana. Or do I apply the grok filter somewhere in Kibana?
Sorry for the noob questions. I hope I can start digging through bigger logs soon...
> My main problem still is: How do I get everything nicely filtered (with the grok expressions I have) into Elasticsearch and Kibana?
If you extract the fields with the grok filter like you're doing in your first example (where you have %{HOSTNAME:hostname} etc) they will end up in Elasticsearch and will therefore be available in Kibana.
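As a sketch, plugging your full expression into the filter section could look like this (field names follow your original pattern; the literal "local IP address" part is an assumption based on your sample line, so treat it as a starting point, not a verified pattern):

```
filter {
  if [type] == "routerlog" {
    grok {
      match => {
        "message" => "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:service}\[%{POSINT:pid}\]: local IP address %{IP:wanip}"
      }
    }
  }
}
```

Every named capture (hostname, service, wanip, ...) becomes a field on the event and is indexed into Elasticsearch, so you can filter on it in Kibana without any extra configuration there.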
Using more than one DATA or GREEDYDATA in the same expression easily leads to weird matches. I'd instead suggest using the standard patterns for syslog messages:
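For instance, something along these lines (a sketch: SYSLOGPROG expands to %{PROG:program}(?:\[%{POSINT:pid}\])?, so it captures the program name and pid for you, and SYSLOGHOST matches either a hostname or an IP):

```
%{TIMESTAMP_ISO8601:timestamp} %{SYSLOGHOST:hostname} %{SYSLOGPROG}: %{GREEDYDATA:syslog_message}
```

You'd then run a second grok (or a dedicated pattern) on syslog_message to pull out the wanip from lines that contain it.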