How to read JSON files and insert in Elastic Stack

I am new to Elastic. I have 10 independent Docker containers, each writing custom JSON files to a single directory. Each container writes files in a different format. Each file is a single-line JSON document with a timestamp. The files are being written at a rate of 10 JSON files per container per second.

I want to:

  1. Read the JSON files and insert them into Elasticsearch.
  2. Delete the files that have been read.
  3. Read files first come, first served (not a mandatory requirement, but good to have).
  4. Insert each JSON file type into a separate index (I am thinking of having 10 indices).
  5. I am open to using Filebeat (though I did not find any option to delete the files) or Logstash.

Any help in this regard is highly appreciated.

Welcome to our community! :smiley:

You can use Filebeat to ship the data to Elasticsearch. There's nothing in the stack that will delete the data though; you will need to manage that yourself.
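As a starting point, here is a minimal `filebeat.yml` sketch showing two of the ten inputs, each tagged so the Elasticsearch output can route it to its own index. The paths, field names, and index names are assumptions; adjust them to your setup:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/containers/app1/*.json   # assumed path for container 1's output
    json.keys_under_root: true            # decode each line as a JSON document
    json.add_error_key: true
    fields:
      app_type: app1                      # tag used below to pick the index

  - type: log
    paths:
      - /var/log/containers/app2/*.json   # assumed path for container 2's output
    json.keys_under_root: true
    json.add_error_key: true
    fields:
      app_type: app2

setup.ilm.enabled: false                  # needed in 7.x when overriding the index name
setup.template.name: "apps"
setup.template.pattern: "app*"

output.elasticsearch:
  hosts: ["localhost:9200"]
  indices:
    - index: "app1-%{+yyyy.MM.dd}"
      when.equals:
        fields.app_type: "app1"
    - index: "app2-%{+yyyy.MM.dd}"
      when.equals:
        fields.app_type: "app2"
```

Repeat the input/`indices` pattern for the remaining containers. Filebeat reads files as it discovers them, so strict first-come-first-served ordering across files is not guaranteed, but events within a file are read in order.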

How do I know which files Filebeat has processed? Maybe I can write a script to delete the files, but I need to know what has been processed.
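Filebeat tracks its read state in a registry under its data directory. In recent 7.x versions this is a newline-delimited JSON log; each file entry carries the source path and the byte offset Filebeat has acknowledged. A cleanup script could compare that offset to the file's size and remove files that have been fully read. The registry path and entry layout below are assumptions based on the 7.x format and vary by version, so treat this as a sketch, not a supported interface:

```python
import json
import os

# Assumed registry location for a default Filebeat 7.x install.
REGISTRY = "/usr/share/filebeat/data/registry/filebeat/log.json"

def processed_files(registry_lines):
    """Map each source path to the highest acknowledged offset.

    Registry entries alternate between operation markers and state
    records; we only care about records whose "v" object carries a
    "source" path and an "offset".
    """
    offsets = {}
    for line in registry_lines:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue
        v = entry.get("v")
        if isinstance(v, dict) and "source" in v and "offset" in v:
            src = v["source"]
            offsets[src] = max(offsets.get(src, 0), v["offset"])
    return offsets

def delete_fully_read(registry_path=REGISTRY):
    """Delete files whose acknowledged offset covers the whole file."""
    with open(registry_path) as f:
        offsets = processed_files(f)
    for path, offset in offsets.items():
        try:
            if os.path.exists(path) and os.path.getsize(path) <= offset:
                os.remove(path)
        except OSError:
            pass  # file may already be gone or locked; skip it
```

Note that deleting a file out from under Filebeat can race with its harvester, so it is safer to run this only against files whose registry entry is older than Filebeat's `close_inactive` window, or to let `clean_removed` tidy the registry afterwards.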

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.