I am new to Elastic. I have 10 independent Docker containers, each writing custom JSON files into a single directory. Each container writes files in a different format. Each file contains a single JSON line with a timestamp. Files are written at a rate of about 10 JSON files per container per second.
I want to:
- Read the JSON files and insert them into Elasticsearch.
- Delete the files once they have been read.
- Read files first come, first served (not a mandatory requirement, but good to have).
- Insert each JSON file type into a separate index (I am thinking of having 10 indices).
- I am open to using either Filebeat (though I did not find any option to delete files after reading) or Logstash.
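
For reference, here is the rough Logstash pipeline I have been sketching (untested; the directory path, the file-naming convention, and the grok pattern are all my assumptions, not something I have working). My understanding is that the file input in `mode => "read"` can delete files after processing, which would cover the delete requirement:

```conf
input {
  file {
    path => "/var/log/mycontainers/*.json"   # placeholder directory
    mode => "read"                           # read whole files instead of tailing
    file_completed_action => "delete"        # remove each file once it is processed
    codec => "json"                          # each file is a single JSON line
  }
}

filter {
  # Assumption: the source container can be derived from the file name,
  # e.g. container3-<something>.json -> "container3". The path field may be
  # "path" or "[log][file][path]" depending on ECS compatibility settings.
  grok {
    match => { "[log][file][path]" => "/(?<container>[^/_-]+)[-_][^/]+\.json$" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{container}"                  # one index per container / file type
  }
}
```

I am not sure whether the file input in read mode preserves first-come, first-served ordering across 10 writers, so I would appreciate confirmation on that point too.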
Any help in this regard is highly appreciated.