Reprocessing old indexes into new, separated indexes in Logstash depending on log path

Hello,

I'm currently in the process of separating indexes depending on the log path. We had to split logs by source file in order to apply lifecycle policies more precisely. At the moment we have one common index for all log types:

Note: here is only a sample of the full settings.
filebeat.yml:

- type: log
  paths:
    - /var/log/messages
  fields:
    log_type: logs

- type: log
  paths:
    - /var/log/secure
  fields:
    log_type: security

Logstash pipeline configuration (logstash.conf):

input { [...] }

filter {
  mutate {
    copy => {
      "[fields][log_type]" => "[@metadata][log_type]"
    }
  }
}

output {
  if [@metadata][log_type] {
    elasticsearch {
      hosts => "http://localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][log_type]}-%{+YYYY.MM.dd}"
    }
  }
}

With this setup, new indexes such as filebeat-logs- and filebeat-security- are created successfully, but the older documents are still in the old combined index (filebeat--).
My question: how can I reprocess the documents in the old indexes with this new Logstash setup, so that they are split into the new indexes according to "log_type"?
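One approach I'm considering is a one-off Logstash pipeline that reads the old index back with the elasticsearch input plugin and pushes the documents through the same filter and output. This is only a sketch: the index name "filebeat-*" and the hard-coded "filebeat" value for [@metadata][beat] are assumptions (when events come from Elasticsearch rather than Beats, [@metadata][beat] is not set, so the index pattern in the output would otherwise be incomplete).

```
input {
  # Read every document from the old combined index.
  # Adjust "index" to match the old index name(s).
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "filebeat-*"
    query => '{ "query": { "match_all": {} } }'
  }
}

filter {
  mutate {
    copy => { "[fields][log_type]" => "[@metadata][log_type]" }
    # Assumption: all old documents came from Filebeat, so set
    # [@metadata][beat] explicitly since the Beats input isn't used here.
    add_field => { "[@metadata][beat]" => "filebeat" }
  }
}

output {
  if [@metadata][log_type] {
    elasticsearch {
      hosts => "http://localhost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][log_type]}-%{+YYYY.MM.dd}"
    }
  }
}
```

Since %{+YYYY.MM.dd} is resolved from each event's @timestamp, documents should land in the daily index matching their original timestamp, not the date of the reprocessing run. After verifying the new indexes, the old ones could then be deleted.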

Current Setup:

  • ELK node with Elasticsearch / Kibana / Logstash
  • Server sending logs with filebeat

Thank you in advance for your help and advice.

Benjamin
