How to change log file indexing between Logstash and Elasticsearch

Hi there, I have many log files from scraping web pages of online shopping stores. I ingest and process them with Logstash, but each log file ends up in its own Elasticsearch index. I don't think this is efficient or helpful, because I want each store's log files to be grouped together.

How can I tweak the indexing process?

I am thinking about creating a single index, with a type for each store name and an ID for each log file (generated daily).
Do you think this is the best approach?

Here's what my Logstash configuration looks like (http://codepad.org/aWFkdnfE).

A separate index for each store probably doesn't make sense, but it depends on how many stores you're indexing. Start with a single index.

Why use a separate type? I'd probably use a single type (especially if the documents have the same schema) and a separate field to indicate the name of the store.
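
Here's a minimal sketch of what that could look like in your Logstash configuration. I haven't seen your actual config, so the directory layout, the "store" field, and the "scraped-products" index name below are just assumptions to illustrate the idea (and depending on your Logstash version, the file path may live in "path" or "[log][file][path]"):

```
input {
  file {
    # hypothetical layout: one directory per store, one log file per day
    path => "/var/log/scrapers/*/*.log"
  }
}

filter {
  grok {
    # hypothetical: derive the store name from the file path into a "store" field
    match => { "path" => "/var/log/scrapers/%{DATA:store}/" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one shared daily index for all stores, instead of one index per log file
    index => "scraped-products-%{+YYYY.MM.dd}"
  }
}
```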

Yes, that's what I meant: a single index for all stores. But how would I be able to separate the log files for each store in Kibana? (e.g. for every store, I want to create a line chart where the X-axis is the date and the Y-axis is the number of products scraped from that store on that date.)

As I said, use a separate field to indicate the name of the store. Use that field for filtering and/or aggregation.
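
In Kibana that's just a date histogram visualization filtered (or split) by that field. Under the hood it boils down to a query like the sketch below. This assumes each document is one scraped product, the field is called "store" and is mapped as a not-analyzed/keyword field, and the indices match "scraped-products-*" as in the earlier sketch; "storeA" is a made-up store name, and the exact histogram parameter ("interval" vs. "calendar_interval") depends on your Elasticsearch version:

```
GET scraped-products-*/_search
{
  "size": 0,
  "query": {
    "term": { "store": "storeA" }
  },
  "aggs": {
    "products_per_day": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "day"
      }
    }
  }
}
```

The doc_count of each daily bucket is then the number of products scraped from that store on that date, which is what the Y-axis of your chart would plot.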
