Hi! How do you accept logs from Filebeat in Logstash? In filebeat.yml I configured the output to point at Logstash (roughly like the sketch below), and in the Logstash pipeline I specified that logs should go straight to Elasticsearch without any filter.
But something goes wrong: the indices don't come out the way they do when I ship directly to Elasticsearch, and not all of them show up.
The question is really about how you do it. If you don't mind, could you show some example configurations?
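For context, the Filebeat side looks roughly like this (a minimal sketch, not my exact file; the log path and the Logstash host are placeholders):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log              # placeholder path

# Filebeat allows only one output, so output.elasticsearch
# must stay commented out when shipping to Logstash.
output.logstash:
  hosts: ["192.168.10.183:5044"]    # assumed Logstash host; 5044 matches the beats input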
Sorry, it's not clear what you mean by that.
Can you post your config?
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  beats {
    port => 5044                      # Filebeat ships here
  }
}

filter {
  if [type] == "syslog" {
    grok {
      # Split the raw syslog line into timestamp, host, program, pid, message
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      # Syslog pads single-digit days with a space, hence the double space in "MMM  d"
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["192.168.10.183:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"    # e.g. filebeat-2024.01.31
    document_type => "%{[@metadata][type]}"
  }
  stdout {}                           # also print each event to the console
}
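A quick way to see whether the %{[@metadata][beat]}-%{+YYYY.MM.dd} pattern is actually creating indices is the _cat API (the host is taken from the config above; the filebeat-* pattern assumes the events come from Filebeat):

curl 'http://192.168.10.183:9200/_cat/indices/filebeat-*?v'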