I have Logstash, Kibana, and Elasticsearch version 5.6.2 running on Windows Server 2012 R2. I am trying to read in live data over a UDP port, but I am getting no data. Here is the input section of my config file:
input {
  udp {
    port => 000
  }
}
The rest of the config file is fine (I had it running with input coming from the archived text files generated daily from the syslogs). I have tried adding type => "syslog", and I have tried adding host => "00.00.00.0", but I am getting no data coming into Kibana.
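For reference, this is how those settings fit together in the input section (as far as I can tell, host defaults to "0.0.0.0", meaning listen on all interfaces, so it shouldn't need to be set at all; the port here is still a placeholder):

input {
  udp {
    port => 000             # placeholder; the real port goes here
    host => "0.0.0.0"       # the default; listens on all interfaces
    type => "syslog"        # adds a type field to each event
  }
}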
Bear with me because I don't know much about this stuff, but the UDP port that is sending the firewall data is a port in the 500's. If this is an invalid port to use, how do I go about this?
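From what I've read, there is nothing invalid about a UDP port below 1024; 514 is the standard syslog port, which may well be the one the firewall is using. Unix systems need elevated privileges to bind ports below 1024, but as far as I know Windows does not, so an input like this should work as long as nothing else is already listening on that port:

input {
  udp {
    port => 514   # standard syslog port, as an example of a port in the 500's
  }
}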
We have the syslog events sending over port 1025 to Logstash, then to Elasticsearch, then to Kibana, but nothing is showing up in Kibana. I verified that data is being sent from the port.
I had this config file running 100%, but I was using the archived syslog text files as the input instead of the live feed over the UDP port. So the filter and output sections should both be fine, unless something else has to change when transitioning from the archived text files to the live syslog events over UDP. This is my config file now:
input {
  udp {
    port => 1025
  }
}
filter {
  # parse the syslog header and the leading ASA tag
  grok {
    named_captures_only => false
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}Local4\.%{LOGLEVEL:level}%{SPACE}%{IP:ourIP}%{SPACE}%{CISCOTIMESTAMP:time}: %%{CISCOTAG:asa}%{GREEDYDATA:messageEnd}" }
  }
  # pull the IP on the outside interface, if the message has one
  grok {
    match => ["messageEnd", "\boutside:%{IP:ip}\b"]
    tag_on_failure => []
  }
  if [ip] {
    geoip {
      source => "ip"
    }
  }
  # drop anything that doesn't mention the outside interface
  if "outside" not in [messageEnd] {
    drop {}
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
I am also not getting any output to stdout, but if I look at the archived text file for today, there are events coming in constantly.
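To narrow it down, a stripped-down pipeline with no filter at all, just the udp input going straight to stdout, should tell me which side the problem is on. If nothing prints even with this, the events aren't reaching Logstash at all (Windows Firewall blocking inbound UDP 1025 would be my first guess); if events do print, then the filter section is dropping them:

input {
  udp {
    port => 1025
  }
}
output {
  stdout {
    codec => rubydebug
  }
}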