How to feed Logs from 11 different instances to Elasticsearch?

Hi All,

We have 11 instances running for an in-house app, and all of them produce logs in XML format.
We want to collect the logs from all 11 instances, feed them into Elasticsearch, and create a Kibana dashboard for them.

I have worked on uploading Excel files into Elasticsearch and creating dashboards, but my knowledge is limited when it comes to doing the same with logs.

Can anyone give me a walkthrough of the process to be followed, so I can pitch the idea to my team?

I get confused about things like:

Do we need to install Elasticsearch on all 11 machines?

Do I need to deploy Logstash on all 11 machines and feed all 11 Logstash pipelines into Elasticsearch?
Or do I need to install Beats on the 11 machines and feed their data into Logstash, which then feeds it into Elasticsearch, or directly into Elasticsearch?
If so, on which machine would the single Elasticsearch instance (and Logstash, if it's used) reside?

Yes, installing Beats on the 11 machines and feeding their data through Logstash into Elasticsearch is a very common approach.

Elasticsearch (and Logstash, if you use it) would typically run on a separate machine, as they can be CPU and I/O intensive.
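
As a minimal sketch of the Beats side: assuming your app writes its XML logs under /var/log/myapp/ and each event starts with an `<event>` root tag (both assumptions, adjust to your setup), the Filebeat config on each of the 11 machines could look like this:

```yaml
# filebeat.yml — runs on each of the 11 app machines
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.xml          # assumption: where your app writes its logs
    # Join the lines of one XML event into a single log entry.
    # Assumption: every event starts with an <event ...> root tag.
    multiline.pattern: '^<event'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["logstash.internal:5044"]   # assumption: hostname of the central Logstash box
```

Filebeat is very lightweight, so running it on the app machines adds little overhead; the heavy parsing then happens once, on the central Logstash machine.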

IMHO...
Kinesis Firehose can collect the logs from all 11 of your instances.
Set Logstash as the endpoint for Kinesis.
Logstash will parse the logs and send them to Elasticsearch.
Kibana will fetch the data from Elasticsearch.
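
For the parsing step, here is a minimal Logstash pipeline sketch. It is shown with a beats input to match the Filebeat setup above (a Kinesis setup would use the logstash-input-kinesis plugin instead); the field names and the index pattern are assumptions:

```
# pipeline.conf — a sketch, not a drop-in config
input {
  beats {
    port => 5044                          # where the 11 Filebeat instances connect
  }
}

filter {
  xml {
    source => "message"                   # the raw XML event shipped by Filebeat
    target => "parsed"                    # parsed fields land under [parsed]
    force_array => false
  }
}

output {
  elasticsearch {
    hosts => ["http://es-node:9200"]      # assumption: your Elasticsearch address
    index => "myapp-logs-%{+YYYY.MM.dd}"  # hypothetical daily index name
  }
}
```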

Hi @groverjatin17,

You can achieve this using Filebeat. It can send data directly to the Elasticsearch cluster.
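
For reference, the direct route is just a different output section in the same filebeat.yml sketched above (the host address is an assumption):

```yaml
# filebeat.yml — direct-to-Elasticsearch variant, skipping Logstash
output.elasticsearch:
  hosts: ["http://es-node:9200"]   # assumption: address of your Elasticsearch node
```

Note that without Logstash in the path, the XML parsing has to happen elsewhere, e.g. in an Elasticsearch ingest pipeline, or with Filebeat's decode_xml processor on recent versions.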

Thanks,
Shrikant

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.