How to process CSV data from multiple folders using Filebeat, create a separate index per file, and send it to Elasticsearch

Hi Team,

Our requirement is as follows:

We have log files stored in CSV format in different folders under a master folder. We want to send these to Elasticsearch using Filebeat as part of an automation.

We can't figure out how to handle multiple CSV files from multiple folders with Filebeat: pull the data from those folders, create a different index for each CSV file, send it to Elasticsearch, and visualize it in Kibana. It is essentially streaming the log files.

Please also let us know whether we need to use Logstash for this scenario.

Can anybody please advise how to approach this?
Thanks in advance.

I did something like this in the past.

Have a look at:

That example uses Logstash; a minimal sketch of the approach follows.
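This is only a sketch, not the exact code from the repo: the master folder path /var/logs/master, the one-subfolder-per-source layout, and the CSV columns (timestamp, level, message) are all assumptions you'd adjust to your files.

```
input {
  file {
    # One subfolder per log source under the master folder (assumed layout)
    path => "/var/logs/master/*/*.csv"
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_csv"
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "level", "message"]  # hypothetical header
  }
  # Derive an index name from the folder the file came from,
  # e.g. /var/logs/master/app1/today.csv -> "app1".
  # On older Logstash versions the source field is "path" instead of [log][file][path].
  dissect {
    mapping => { "[log][file][path]" => "/var/logs/master/%{source_folder}/%{}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv-%{source_folder}-%{+YYYY.MM.dd}"  # one index per folder, per day
  }
}
```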

I also have another "version" where I use Filebeat for this together with ingest pipelines; a rough filebeat.yml sketch is below.
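Again just a sketch under assumed paths and folder names (app1, app2); parse_csv is a hypothetical ingest pipeline, shown further down. Note that overriding the default index name requires the setup.template.* settings:

```
filebeat.inputs:
  - type: log              # use "filestream" on newer Filebeat versions
    paths:
      - /var/logs/master/app1/*.csv
    fields:
      folder: app1         # used below to pick the index
    fields_under_root: true
  - type: log
    paths:
      - /var/logs/master/app2/*.csv
    fields:
      folder: app2
    fields_under_root: true

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: parse_csv                      # ingest pipeline that splits each CSV line
  index: "csv-%{[folder]}-%{+yyyy.MM.dd}"  # one index per folder, per day

# Required when overriding the default index name:
setup.ilm.enabled: false
setup.template.name: "csv"
setup.template.pattern: "csv-*"
```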

The code is available at:

There's a video here which might help.
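For completeness, this is the shape of the ingest pipeline that the filebeat.yml sketch above points at. The pipeline name and the target field names are again assumptions; you can create it from Kibana Dev Tools:

```
PUT _ingest/pipeline/parse_csv
{
  "description": "Split each CSV line shipped by Filebeat into fields",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["timestamp", "level", "message_text"],
        "separator": ",",
        "ignore_missing": true
      }
    },
    { "remove": { "field": "message", "ignore_missing": true } }
  ]
}
```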
