How do I create an index per log file that I read?


(magdiel) #1

Hi, I have 3 types of log files that I am reading from. My setup is as follows: Filebeat → Logstash → Elasticsearch → Kibana.

The 3 types of logs are: processing logs, spike logs, and error logs.

I would like to have an index for each type of log that I am reading instead of the index always being logstash-'date'...
For example, when Logstash is processing spike logs, put them under the index name "Spike"; when it is processing processing logs, put them under the index name "Processing"; and so on.

Is it possible to look at the path of the log file and, if the path contains a string such as "spk", tell Logstash it's a spike log and put it under the spike index?

I've been looking all over the place and cannot seem to find an answer to this. Thank you


(Magnus Bäck) #2
output {
  # Route events whose path contains "spk" to the "spike" index.
  if [path] =~ /spk/ {
    elasticsearch {
      ...
      index => "spike"
    }
  }
}

You could also put the conditionals in the filter section and set a field with the desired index name, then reference that field with a %{fieldname} reference in the index option of a single elasticsearch output.
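A sketch of that filter-section approach, assuming the event field is named [@metadata][index] (any field name works; using @metadata keeps the field out of the stored document) and that "proc" appears in the processing-log paths, which is an assumption about your file naming:

filter {
  # Pick an index name based on the file path; adjust the patterns to your paths.
  if [path] =~ /spk/ {
    mutate { add_field => { "[@metadata][index]" => "spike" } }
  } else if [path] =~ /proc/ {
    mutate { add_field => { "[@metadata][index]" => "processing" } }
  } else {
    mutate { add_field => { "[@metadata][index]" => "error" } }
  }
}

output {
  elasticsearch {
    ...
    # One output covers all three log types.
    index => "%{[@metadata][index]}"
  }
}

This keeps the output section down to a single elasticsearch block instead of one conditional branch per log type.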


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.