Multiple log files per index -- need to create one index per log file

Hi

My current setup is Filebeat -> Elasticsearch -> Kibana.

Current behavior:

  1. I send one file to the work directory; Filebeat ingests it and sends it to Elasticsearch.
  2. The log output is sent to Elasticsearch without issues.
  3. I send another file to the same work directory and expect Filebeat to ingest it and send the log data to Elasticsearch -- no issue.
  4. The new log data is appended to the same index, and when I visualize the data, I can see both the old and new data.

Desired behavior:

  1. I send one file to the work directory; Filebeat ingests it and sends it to Elasticsearch.
  2. The log output is sent to Elasticsearch without issues.
  3. I send another file to the same work directory and expect Filebeat to ingest it and send the log data to Elasticsearch -- no issue.
  4. A new index is created for the new log file, or the old and new data can be handled separately.

My question is:

a. How can I configure the index template (or the Filebeat output) to create a new index per new log file? My goal is to find a way to separate the data from the two different log files; a rough sketch of what I have in mind is below.
b. If Filebeat does not support a different index per log file, is there another way to separate the new and old data (docs) within one common index?
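For question a, I was imagining something along these lines in filebeat.yml, using the conditional index rules of the Elasticsearch output. The host, index names, and "app1"/"app2" path fragments are just placeholders for my setup, and I have not confirmed this is the right approach:

```yaml
output.elasticsearch:
  # Placeholder host; ours differs.
  hosts: ["localhost:9200"]
  # Route each event to an index based on its source file path.
  # log.file.path is added by Filebeat to every event it ships.
  indices:
    - index: "filebeat-app1-%{+yyyy.MM.dd}"
      when.contains:
        log.file.path: "app1"
    - index: "filebeat-app2-%{+yyyy.MM.dd}"
      when.contains:
        log.file.path: "app2"

# Note: custom index names also require matching
# setup.template.name and setup.template.pattern settings.
```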

Best Regards

Hung Le

Having a separate index per log file is a bad idea: it will lead to a lot of small indices and shards, which is inefficient, will cause problems, and will not scale well.
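If the goal is just to keep the data from different files separable, keep one index and distinguish the documents by their source file instead. Filebeat already records the originating path on every event (log.file.path in recent versions, source in older ones), so you can filter on it in Kibana. You can also tag each input explicitly with a custom field. A minimal sketch, assuming a log input and placeholder paths you would adjust to your work directory:

```yaml
filebeat.inputs:
  - type: log
    paths:
      # Placeholder path; point this at your work directory.
      - /work/app1/*.log
    # Custom metadata attached to every event from this input;
    # it ends up under fields.dataset unless fields_under_root is set.
    fields:
      dataset: app1
  - type: log
    paths:
      - /work/app2/*.log
    fields:
      dataset: app2
```

In Kibana you can then filter on fields.dataset (or on log.file.path directly) to separate the two files' data while still using a single index.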
