Hi there, I have many log files from scraped web pages of online shopping stores. I ingest and process them with Logstash, but each log file currently ends up in its own Elasticsearch index. I think this is inefficient and unhelpful, since I want all of a store's log files to be grouped together.
How can I tweak the indexing process?
I am thinking about creating a single index, with a type for each store name and an ID for each (daily generated) log file.
Do you think this is the best approach?
Here's what my Logstash configuration looks like (http://codepad.org/aWFkdnfE).
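For context, the grouping I have in mind would look roughly like the sketch below in the `elasticsearch` output plugin. This is only an illustration of the idea, not my actual config: the `store` and `log_date` field names are hypothetical and would have to be extracted in the filter stage, and the plugin options shown (`index`, `document_type`, `document_id`) are standard options of the logstash-output-elasticsearch plugin:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]

    # One shared index for all stores instead of one index per log file.
    index => "shop_logs"

    # Group documents by store name (assumes a "store" field was set
    # earlier in the filter stage).
    document_type => "%{store}"

    # Optionally derive the document ID from the daily log file, e.g. a
    # hypothetical "log_date" field, so re-ingesting the same file
    # overwrites rather than duplicates.
    document_id => "%{store}-%{log_date}"
  }
}
```

An alternative would be to keep one index per store (`index => "shop_logs-%{store}"`) rather than using types, which avoids putting differently shaped documents in one index.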