Hi,
I am learning the Elastic Stack, so I have set it up at home to prepare for when we start designing a deployment at work.
I have a simple setup where Logstash is listening for syslog messages. It works great with most hosts, but my router does not seem to send syslog in the correct format, and I get a _grokparsefailure from that host.
My router is 192.168.0.1, and I am trying to match anything from that IP without applying the grok filter I use for everything else.
This is an example of the output:
{
    "@timestamp" => 2021-02-07T01:09:42.229Z,
          "host" => "192.168.0.1",
      "@version" => "1",
          "type" => "syslog",
       "message" => "<13> DHCPD: Recv REQUEST from 7A:1C:A2:5C:EF:54"
}
This is my config for this pipeline. I have commented out my normal filter and added a new one that is meant to match only 192.168.0.1. Can you spot what I am doing wrong?
input {
  udp {
    port => 514
    type => "syslog"
  }
  tcp {
    port => 514
    type => "syslog"
  }
}
filter {
  if '%{host}' == "192.168.0.1" {
    grok {
      add_field => [ 'received_at', '%{@timestamp}' ]
      add_field => [ 'received_from', '%{host}' ]
      add_field => [ 'filter_used', 'only gw' ]
    }
    date {
      match => [ 'syslog_timestamp', 'MMM d HH:mm:ss', 'MMM dd HH:mm:ss' ]
    }
  }

  # if [type] == 'syslog' and ['host'] != "192.168.0.1" {
  #   grok {
  #     match => { 'message' => '%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}' }
  #     add_field => [ 'received_at', '%{@timestamp}' ]
  #     add_field => [ 'received_from', '%{host}' ]
  #   }
  #   date {
  #     match => [ 'syslog_timestamp', 'MMM d HH:mm:ss', 'MMM dd HH:mm:ss' ]
  #   }
  # }
}
output {
  elasticsearch { hosts => ['elasticsearch:9200'] }
  stdout { codec => rubydebug }
}
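In case it helps, my current (unconfirmed) guess is that conditionals should use Logstash's field-reference syntax, [host], rather than the sprintf-style '%{host}', and that a grok block with no match pattern may not be valid, so perhaps a mutate filter is what I need for adding fields. Something like this sketch, which I have not tested yet:

```text
filter {
  # Guess: conditionals compare field references, not sprintf strings
  if [host] == "192.168.0.1" {
    # Guess: use mutate (not an empty grok) just to add fields
    mutate {
      add_field => { "received_at"  => "%{@timestamp}" }
      add_field => { "received_from" => "%{host}" }
      add_field => { "filter_used"  => "only gw" }
    }
  }
}
```

Is that the right direction, or is something else wrong with my original conditional?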