I am working on an archiving product and we have been looking to replace our indexing/search tool with Elasticsearch.
Our current process is roughly like this:
- Incoming Data is processed
- Data is archived as .zip or .tar.gz
- Data is uploaded to a shared folder
- Our tool scans the shared folder on an interval
- It grabs each new archive and ingests it into our indexing tool
- It deletes the archive from the shared folder
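For context, the scan-ingest-delete loop above can be sketched in a few lines of Python. This is just a rough illustration of the workflow, not our actual tool; `index_doc` is a hypothetical callback standing in for whatever would push documents into the search backend (e.g. a bulk indexing helper).

```python
import tarfile
import zipfile
from pathlib import Path

def iter_archive_docs(archive_path):
    """Yield (member_name, text) pairs from a .zip or .tar.gz archive."""
    p = Path(archive_path)
    if p.suffix == ".zip":
        with zipfile.ZipFile(p) as zf:
            for name in zf.namelist():
                if not name.endswith("/"):  # skip directory entries
                    yield name, zf.read(name).decode("utf-8", errors="replace")
    else:  # assume .tar.gz
        with tarfile.open(p, "r:gz") as tf:
            for member in tf.getmembers():
                if member.isfile():
                    f = tf.extractfile(member)
                    if f is not None:
                        yield member.name, f.read().decode("utf-8", errors="replace")

def ingest_new_archives(shared_folder, index_doc):
    """One polling pass: ingest every archive in the folder, then delete it."""
    for archive in Path(shared_folder).glob("*"):
        if archive.suffix not in (".zip", ".gz"):
            continue
        for name, text in iter_archive_docs(archive):
            # index_doc is hypothetical: in practice this would be a bulk
            # request to the indexing/search tool
            index_doc({"filename": name, "content": text})
        archive.unlink()  # delete the archive once it is fully ingested
```

In a real deployment this pass would run on a timer, and you would want to skip archives that are still being uploaded (e.g. by checking mtime or using a rename-when-complete convention).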
I have looked into Beats and Logstash, and also the new "ingest node" feature in Elasticsearch 5.0, but so far I have not come across a natural fit for this workflow.
Is this doable OOTB without much work?
I am not familiar with Ruby, and it seems that most of the Logstash plugins/filters are written in it.