Setting up an ELK Stack solution. What I am trying to do is get syslog data into Logstash (maybe this is more of a Kibana issue?). The ELK stack works fine with Netflow, which gives me 41 fields of data to play around with. I wanted some more, so I configured the syslog input plugin, and I can see it receiving syslog messages, a ton of them! But I can't see them when browsing in Kibana. My logstash.conf looks like this:
input { stdin { } }

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

input {
  syslog {
    host => "192.168.200.129"
    port => 514
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
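As far as I understand, Logstash concatenates all the blocks in the config into a single pipeline, so with two separate output sections each event should end up being sent to Elasticsearch twice. Just as a sketch of what I think the equivalent single-pipeline version would look like (same settings, only merged into one input and one output block):

input {
  # keep stdin from the original tutorial config
  stdin { }
  # same syslog listener as above
  syslog {
    host => "192.168.200.129"
    port => 514
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}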
The syslog messages look like this (just an example):
{
    "severity" => 4,
    "@timestamp" => 2017-03-06T00:07:06.000Z,
    "@version" => "1",
    "host" => "192.168.200.1",
    "program" => "kernel",
    "message" => "[55560.308685] [WAN-TO-VLAN200-1-A] IN=eth1.200 OUT=eth0 MAC=00:15:5d:0a:8d:4f:00:15:5d:0a:8d:55:08:00 SRC=192.168.200.129 DST=172.xxx.22.xxx LEN=40 TOS=0x00 PREC=0x00 TTL=127 ID=91 DF PROTO=TCP SPT=59024 DPT=443 WINDOW=1019 RES=0x00 ACK FIN URGP=0 ",
    "priority" => 4,
    "logsource" => "asa01",
    "facility" => 0,
    "severity_label" => "Warning",
    "timestamp" => "Mar 5 16:07:06",
    "facility_label" => "kernel"
}
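The messages above come from the firewall, but if it helps to reproduce, one way to hand-craft a single test message (assuming util-linux logger is available on another box) would be something like:

# send one UDP syslog message to the Logstash syslog input
logger --server 192.168.200.129 --port 514 --udp "test message for the syslog input"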
The index pattern is logstash-*, and the only field names I see in Kibana are the ones from when I ran Netflow, even though there should be some new ones now that I have added the syslog plugin.
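To double-check whether the syslog events actually make it into Elasticsearch (and it is just Kibana not showing them), I was going to query the index directly, something like this (assuming the default logstash-* index names and the logsource field from the example above):

curl -XGET 'http://localhost:9200/logstash-*/_search?q=logsource:asa01&pretty'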
I'm very new to this, so I'm sure it's a layer 8 problem.