Logstash Index not in Kibana


(Erik Heskes) #1

Hi there,

I've noticed that my indexes aren't being added to Kibana anymore. I did change my Logstash config files in the meantime, but even then I would only expect grok parse failures, not my index to disappear.
This is the config:

input {
  udp {
    port => 5514
    type => syslog
  }
  tcp {
    port => 5514
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-syslog"
  }
  stdout { codec => rubydebug }
}


(Erik Heskes) #2

Everything seems alright on my end. Any other ideas?

[TCPDUMP shows incoming data on correct port]
[Logstash/Elasticsearch/Kibana instances are running]
[output debug logs also look good]
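For reference, the checks described above can be run from the command line like this (a sketch, assuming the default ports from the config; `-i any` and the exact curl flags are my additions):

tcpdump -n -i any port 5514

curl -s 'localhost:9200/_cluster/health?pretty'

curl -s 'localhost:9200/_cat/indices?v'

The first command confirms syslog traffic actually reaches the host on the Logstash port; the other two confirm Elasticsearch is responding and list which indices it holds.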


(Magnus Bäck) #3

So you've verified that the ES instance on localhost:9200 doesn't have a logstash-syslog index with the current data? How did you reach that conclusion?
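One quick way to check this (assuming Elasticsearch is on localhost:9200 and the index name from the config above):

curl -s 'localhost:9200/_cat/indices/logstash-syslog?v'

curl -s 'localhost:9200/logstash-syslog/_count?pretty'

If the first command returns a 404 the index was never created; if `_count` stays flat while new events arrive, documents are not being indexed.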


(Erik Heskes) #4

This has been solved; it was a local firewall issue :blush:
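For anyone hitting the same symptom: the poster didn't say which firewall was involved, but on a firewalld-based distribution the check and fix would look roughly like this (a hypothetical sketch, not the poster's actual commands):

# See which ports are currently open
firewall-cmd --list-ports

# Open the Logstash syslog port for both protocols used in the config
firewall-cmd --permanent --add-port=5514/udp
firewall-cmd --permanent --add-port=5514/tcp
firewall-cmd --reload

On iptables/nftables systems the equivalent would be an ACCEPT rule for port 5514.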


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.