Hello,
I'm a newbie in the ELK world.
I'm using Beats (Filebeat and others) with output to Logstash. Logstash only forwards to the ES ingest node (https://www.elastic.co/guide/en/logstash/current/use-ingest-pipelines.html):
input {
  beats {
    port => 5044
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "ES_hostname:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "pass"
    }
  } else {
    elasticsearch {
      hosts => "ES_hostname:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      user => "elastic"
      password => "pass"
    }
  }
}
I've already loaded the Beats ingest pipelines into ES, of course, so everything seems to work well.
But I have doubts. With this configuration, is there any benefit to using Logstash, or would it be better to point the Beats directly at Elasticsearch?
My understanding is that this setup uses the ES ingest node (with the preloaded Beats pipelines) to do all the work, while Logstash only forwards the events without processing them.
Am I right? What would be the best architecture, considering that for now I'm not doing any additional processing on the events?
I have 300 virtual machines in my infrastructure.
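For reference, the direct alternative would be something like this in each filebeat.yml (just a sketch; the host and credentials are the same placeholders as above, and as far as I understand I'd still run filebeat setup once so the ingest pipelines and templates get loaded):

# filebeat.yml: ship straight to Elasticsearch, no Logstash in between
output.elasticsearch:
  hosts: ["ES_hostname:9200"]
  username: "elastic"
  password: "pass"
# with this output, the Filebeat modules select their ingest pipeline automatically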
Thanks all!!