Hello everyone,
I am completely new to Logstash, Kibana and Elasticsearch.
However, I had no issues setting up the server and forwarding log files from Windows and Linux machines. What is giving me a hard time is getting log files from an HP ProCurve switch into Logstash.
I configured the switch via the "logging" command to send its logs to my Logstash server, but they are not showing up in Kibana. I tried several configuration options in the lumberjack conf file (not needed, as far as I can tell) and the syslog conf file, so they are a bit bloated; I got a little desperate.
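For reference, the switch side is minimal; what I ran was essentially just this from the ProCurve CLI (192.0.2.10 is a placeholder for my Logstash server's address, not my real IP):

```
logging 192.0.2.10
```
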
Here is the configuration of the lumberjack conf file:
input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

input {
  tcp {
    codec => json_lines { charset => "CP1252" }
    port => 3515
    tags => [ "tcpjson" ]
  }
}

input {
  syslog {
    port => 1514
  }
}

input {
  udp {
    port => 514
    type => "Procurve"
  }
}

filter {
  date {
    locale => "en"
    timezone => "Etc/GMT"
    match => [ "EventTime", "YYYY-MM-dd HH:mm:ss" ]
  }
}

output {
  elasticsearch {
    host => "localhost"
  }
  stdout { codec => rubydebug }
}
And here is my configuration of the syslog conf file:
input {
  udp {
    port => 514
    type => "Procurve"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }

  if [type] == "Procurve" {
    if [message] =~ "last message repeated" {
      grok {
        match => [ "message", "<[0-9]*>%{SYSLOGTIMESTAMP:timestamp} %{IPORHOST:hostname} %{GREEDYDATA:msg-repeated}" ]
      }
    } else {
      grok {
        match => [ "message", "<[0-9]*>%{SYSLOGTIMESTAMP:timestamp} %{IPORHOST:hostname} %{DATA:switch-category}:\s+%{GREEDYDATA:switch-message}" ]
      }
    }

    if [switch-category] =~ "ports|FFI" {
      if [switch-category] =~ "ports" {
        mutate { add_tag => [ "layer1" ] }
      }
      if [switch-category] =~ "FFI" {
        mutate { add_tag => [ "layer2" ] }
      }
      grok {
        match => [ "switch-message", "port %{DATA:port}[- ]%{GREEDYDATA:port-message}" ]
      }
    }

    date {
      match => [ "timestamp", "MMM dd HH:mm:ss", "MMM d HH:mm:ss" ]
      timezone => "America/Los_Angeles"
    }
  }
}
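To convince myself that the Procurve grok at least matches the message shape I expect, I replayed its logic on one hand-made sample line outside of Logstash. The sample line and the sed patterns below are my own approximation of the grok behaviour, not actual switch output:

```shell
# Hypothetical ProCurve-style line, shaped like the grok pattern above expects:
# <PRI>TIMESTAMP HOSTNAME CATEGORY:  MESSAGE
line='<14>Jan  7 10:15:00 10.0.0.2 ports:  port 3 is now on-line'

# Rough sed equivalents of %{DATA:switch-category} and %{GREEDYDATA:switch-message}
# after stripping the <PRI> prefix, timestamp, and hostname
category=$(printf '%s\n' "$line" | sed -E 's/^<[0-9]*>[A-Z][a-z]{2} +[0-9]+ [0-9:]+ [^ ]+ ([^:]+):.*$/\1/')
message=$(printf '%s\n' "$line" | sed -E 's/^<[0-9]*>[A-Z][a-z]{2} +[0-9]+ [0-9:]+ [^ ]+ [^:]+:[[:space:]]+(.*)$/\1/')

echo "category=$category"   # category=ports
echo "message=$message"     # message=port 3 is now on-line
```

If the real switch messages deviate from this shape (e.g. no `<PRI>` prefix, because the udp input does not strip it the way the syslog input does), the grok would fail and the events would get a `_grokparsefailure` tag instead of silently disappearing, so the fields above are worth checking in Kibana.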
As I said, they are bloated beyond usefulness, but well ... you never know.
Logstash is running with root privileges so it should be able to listen on port 514.
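Independent of Logstash, I also sanity-checked what a raw syslog datagram looks like on the wire with a loopback test on an unprivileged port; if this works, the identical send against port 514 should reach the udp input (python3 here is just a stand-in listener, not part of my setup):

```shell
# Stand-in UDP listener + sender on an unprivileged port (5514, so no root needed).
# If the line comes back here, the same send to port 514 should reach Logstash.
python3 - <<'EOF'
import socket

listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 5514))

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"<14>Jan  7 10:15:00 10.0.0.2 ports:  port 3 is now on-line",
              ("127.0.0.1", 5514))

data, addr = listener.recvfrom(1024)
print(data.decode())  # the raw line, exactly as the udp input would see it
listener.close()
sender.close()
EOF
```
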
Any suggestions would be much appreciated.
Kind thanks in advance,
Chris