I want to centralize the logs in my application environment. I have installed ELK (Elasticsearch, Logstash and Kibana) and I am using a grok filter on a file input, which is working fine on my local machine.
To use it as a centralized solution, I can install the stack on a VM and run the Filebeat service on all my application servers, which picks up the data from the log files and sends it to Elasticsearch. Kibana sits on top of Elasticsearch and can show me the logs based on the indexes created in Elasticsearch.
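Something like this is what I have in mind for Filebeat on each application server (the paths and the Elasticsearch host are just placeholders, and the exact syntax depends on the Filebeat version; a 5.x-style config is shown here):

    filebeat.prospectors:
    - input_type: log
      paths:
        - /var/log/myapp/*.log        # example path, adjust to the real log location

    output.elasticsearch:
      hosts: ["elk-vm:9200"]          # example host for the central VM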
Now the question is: if this works with just Filebeat, Elasticsearch and Kibana, do I still need Logstash? If yes, what would Logstash's job be?
I asked myself this very thing about 10 days ago.
Turns out I needed it for geo data in my input on ELK:
patterns_dir => ["/opt/logstash/vendor/patterns"]
and I couldn't find an equivalent patterns directory anywhere else.
I don't know what else it may affect, but that is what I noticed.
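For what it's worth, this is a rough sketch of the kind of Logstash filter I mean, assuming an Apache-style access log with a clientip field (the pattern and field name are just examples):

    filter {
      grok {
        patterns_dir => ["/opt/logstash/vendor/patterns"]
        match => { "message" => "%{COMBINEDAPACHELOG}" }   # example pattern
      }
      geoip {
        source => "clientip"                               # example field holding the client IP
      }
    }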
Thanks for the precise information. Actually, I was looking to add a filter property in Filebeat so that my nodes send data directly to ES in the format I am expecting for Kibana. But it seems Filebeat does not offer a grok filtering method, according to the following thread.
But don't you think filtering at the prospector level would be enough to do the job for me (see the snippet below for the kind of options I mean)? Here is my previous thread (where you already answered a lot), which shows how my logs look and what grok filter I have applied to parse them.
What's your suggestion? Do I still need Logstash? I have not yet tested Filebeat and prospector-level filtering, which is why I am asking the community about their experiences.
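From what I can see in the docs, prospector-level options are limited to line selection and multiline handling rather than grok-style parsing, roughly like this (the patterns are just examples):

    filebeat.prospectors:
    - input_type: log
      paths:
        - /var/log/myapp/*.log
      include_lines: ["^ERROR", "^WARN"]   # example: ship only error/warn lines
      exclude_lines: ["^DEBUG"]            # example: drop debug noise
      multiline.pattern: '^\['             # example: join stack traces to the preceding line
      multiline.negate: true
      multiline.match: after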
You might be able to use the ingest node feature of ES to parse those logs, but I haven't used it myself. Filebeat and Elasticsearch without the ingest feature won't be sufficient.
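Roughly, an ingest pipeline with a grok processor would look something like this (the pipeline name and pattern are purely illustrative, adapt them to your log format):

    PUT _ingest/pipeline/app-logs
    {
      "description": "parse application logs",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}"]
          }
        }
      ]
    }

Filebeat can then be pointed at that pipeline with the pipeline setting under output.elasticsearch, e.g. pipeline: "app-logs".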