I am working through a Udemy course to familiarize myself with Elasticsearch, but I've run into a wall with Logstash when applying it to our current environment. I am trying to send Cisco syslog data into Logstash, but when I watch the pipeline output in the CLI, every event comes back tagged _grokparsefailure:
"type" => "syslog",
"message" => "<189>751: Jul 9 17:55:29.743: %SYS-5-CONFIG_I: Configured from console by admin on vty0 (172.16.5.52)",
"host" => "10.15.0.8",
"@version" => "1",
"@timestamp" => 2020-07-09T17:55:30.744Z,
"tags" => [
[0] "_grokparsefailure"
I have the following conf file:
input {
  tcp {
    port => 514
    type => syslog
  }
  udp {
    port => 514
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
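Comparing the raw message to the grok pattern, I notice my Cisco line starts with a priority and sequence number (<189>751:) followed by a millisecond timestamp, rather than the %{SYSLOGTIMESTAMP} %{SYSLOGHOST} shape the pattern expects, so I've been sketching a Cisco-specific filter along these lines. This is only a guess: it assumes the stock CISCOTIMESTAMP pattern that ships with logstash-patterns-core applies here, the field names (priority, sequence, facility, mnemonic, cisco_message) are just placeholders I made up, and I'm guessing at the millisecond date formats:

filter {
  if [type] == "syslog" {
    grok {
      # Trying to match: <pri>seq: timestamp: %FACILITY-SEVERITY-MNEMONIC: text
      match => { "message" => "<%{NONNEGINT:priority}>%{NONNEGINT:sequence}: %{CISCOTIMESTAMP:cisco_timestamp}: %%{DATA:facility}-%{INT:severity}-%{DATA:mnemonic}: %{GREEDYDATA:cisco_message}" }
    }
    date {
      # Cisco msec timestamps like "Jul 9 17:55:29.743"
      match => [ "cisco_timestamp", "MMM d HH:mm:ss.SSS", "MMM  d HH:mm:ss.SSS", "MMM dd HH:mm:ss.SSS" ]
    }
  }
}

I haven't been able to confirm whether that's the right direction, or whether every Cisco message type even follows this shape.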
I know I'm new enough at this not to know what I don't know. My initial thought was that these logs might need their own index, but then I'm not sure how the events would be tagged to keep them separate from the others (my stab at that is below). Or could it be that the format of my logs simply isn't matching what the stock pattern expects, which is what the sketch above is trying to get at? I've also seen talk of a Cisco module. I've been looking for answers, but since I'm at that point of not knowing what I don't know, nothing ever seems to jump out as the answer.
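That stab looks like this, keyed off the type field I already set; the cisco-syslog index name is just a placeholder, and I don't know whether a conditional in the output block is even the right way to do this:

output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "cisco-syslog-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch { hosts => ["localhost:9200"] }
  }
  stdout { codec => rubydebug }
}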
Thank you.