Logstash UDP plugin

I have Logstash, Kibana and Elasticsearch version 5.6.2 running on Windows 2012 R2. I am trying to read in live data over a UDP port, but I am getting no data. Here is the input in my config file:

input {
    udp {
        port => 000
    }
}

The rest of the config file is fine (I had it running with input coming from the archived text files generated daily from the syslogs). I have tried adding type => "syslog" and host => "00.00.00.0", but I am getting no data coming into Kibana.

Thanks in advance!

You need to configure it to bind to a valid port, usually > 1024 unless you run as root, which is discouraged.
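For example, a minimal sketch of a UDP input bound to an unprivileged port (5514 is just an illustrative choice; any free port above 1024 would do):

input {
    udp {
        port => 5514        # any free port above 1024; no elevated privileges needed
        type => "syslog"    # optional, lets later filter/output stages match on it
    }
}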

Bear with me because I don't know much about this stuff, but the UDP port that is sending the firewall data is a port in the 500s. If this is an invalid port to use, then how do I go about this?

Your input needs to bind to a valid port number. You can then configure the sender to send to it.

So if I change my input to, say, port => 1024 and have the sender send the data from port 514 to 1024, I should start seeing data?

I have it listening at port 1025 (we reconfigured it) and I still get no data. Other suggestions?

Have you got anything sending to that port on that host?

We have the syslog events sending over port 1025 to Logstash, then to Elasticsearch, then Kibana, but nothing is showing up in Kibana. I verified data is being sent from the port.

Have you verified that data is arriving to the port Logstash is listening to?
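One way to sanity-check the listener itself (a sketch; it assumes the generator input and the logstash-output-udp plugin are available in your install) is a throwaway pipeline that fires a test event at the port:

input {
    generator {
        count   => 1
        message => "test event for the UDP listener"
    }
}
output {
    udp {
        host => "127.0.0.1"   # host where the udp input is listening
        port => 1025
    }
}

If that test event shows up in the listening pipeline's output, the input itself is working and the problem is upstream of Logstash.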

Yes, there are events coming in; they're just not showing up in Kibana.

Enable a stdout output to verify what is or is not coming through. What does your config look like? Could events be dropped or go to the wrong index?
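As a sketch (the port and settings are just illustrative), the quickest check is a stripped-down pipeline with no filter section at all, so nothing can be dropped before you see it:

input {
    udp {
        port => 1025
    }
}
output {
    stdout {
        codec => rubydebug   # prints every received event to the console
    }
}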

I had this config file running 100%, but I was using the archived syslog text files as the input instead of the live feed over the UDP port. So the filter and output sections should both be fine, unless something else has to change when moving from the archived text files to the live syslog events. This is my config file now:

input {
    udp {
        port => 1025
    }
}

filter {
    grok {
        named_captures_only => false
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}Local4\.%{LOGLEVEL:level}%{SPACE}%{IP:ourIP}%{SPACE}%{CISCOTIMESTAMP:time}: %%{CISCOTAG:asa}%{GREEDYDATA:messageEnd}" }
    }
    grok {
        match => ["messageEnd", "\boutside:%{IP:ip}\b"]
        tag_on_failure => []
    }
    if [ip] {
        geoip {
            source => "ip"
        }
    }
    if "outside" not in [messageEnd] {
        drop {}
    }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
    }
    stdout {
        codec => rubydebug
    }
}

I'm also not getting any output to stdout. But if I look at the text file for today, there are events coming in constantly.
