I have just installed a new ELK stack instance, but I can't figure out what I am missing. By default, Filebeat data was being shipped and that worked. I don't intend to use Filebeat, since I am mainly collecting syslogs from network devices such as routers that won't run a Filebeat client. I have set up Logstash to accept UDP from the source, with output to Elasticsearch as well as to a file to verify the Logstash piece is working. Data seems to be getting indexed into the logstash index on Elasticsearch, but nothing shows up in the Discover tab.
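To double-check what Elasticsearch is actually receiving, I can query it directly, assuming the default localhost:9200 endpoint and the logstash-* index name used in the config below:

# list every index with its document count
curl 'localhost:9200/_cat/indices?v'

# peek at a couple of documents in the logstash index
curl 'localhost:9200/logstash-*/_search?size=2&pretty'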
Here is my Logstash config:
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  tcp {
    port => 5002
    type => "cradlepoint"
  }
  udp {
    port => 5002
    type => "cradlepoint"
  }
  beats {
    port => 5044
  }
  file {
    path => "/var/log/syslog-ng"
    start_position => "beginning"
    type => "syslog"
  }
}
filter {
  if [type] == "syslog-ng" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  else if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  # stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    # an elasticsearch output takes a single index setting, so the extra candidates are commented out
    index => "logstash-%{+YYYY.MM.dd}"
    # index => "syslog-ng-%{+YYYY.MM.dd}"
    # index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # index => "%{[@metadata][index]}"
    document_type => "system_logs"
  }
  # copy of every event written to disk to verify the pipeline
  file { path => "/tmp/logstash.log" }
}
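If it helps, these are the kinds of checks I can run against this setup. The Logstash binary and settings paths are the usual package-install defaults and netcat is assumed to be available, so adjust as needed:

# verify the pipeline config parses cleanly before restarting the service
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit

# fire one hand-built syslog line at the UDP input and watch for it in /tmp/logstash.log
echo "<13>$(date '+%b %d %H:%M:%S') router1 test: hello" | nc -u -w1 localhost 5002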