Logstash and index recreation

We upgraded our Elasticsearch cluster and accidentally pointed it at the old data directories, and ever since we have been getting tons of dangling-indices log entries. We built a new Elasticsearch cluster, and my hope was that once I pointed Logstash at the new cluster, most of the indices would be recreated from the log file entries. However, the only index that was generated was the current day's index. I am trying to understand why.

Here is an example of the Logstash .conf file:

input {
  file {
    path => [
      "/logs/xxx-*.log"
    ]
  }
}
filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:class}\] %{LOGLEVEL:severity} %{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["xx.xx.xx.xxx:9200", "xx.xx.xx.xxx:9200", "xx.xx.xx.xxx:9200", "xx.xx.xx.xxx:9200"]
      index => "name-appengine-%{+YYYY.MM.dd}"
    }
  }
}
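
One possible explanation, assuming the intent was for Logstash to re-read the existing log files: the file input by default only tails files from the end and records its read position in a sincedb, so lines written before Logstash started (or lines it has already read once) are not processed again. In that case only newly written lines reach Elasticsearch, which would produce only the current day's index. A minimal sketch of the input section for a one-off replay of the old files follows; the start_position and sincedb_path settings are additions, not part of the original config:

input {
  file {
    path => [
      "/logs/xxx-*.log"
    ]
    # Read each matching file from the top instead of only tailing new lines.
    # This setting only applies to files Logstash has not seen before.
    start_position => "beginning"
    # Throw away the remembered read positions so already-seen files are
    # re-read as well; /dev/null is a disposable sincedb for a one-off replay.
    sincedb_path => "/dev/null"
  }
}

Because the sincedb is discarded on every restart with this setup, it is only suitable for a one-time reimport, not for normal continuous shipping.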

Are you asserting that the @timestamp field is in the past but the string interpolation of the index name uses the current date?
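
For context on that question: the %{+YYYY.MM.dd} reference in the index name is a sprintf date reference and is resolved from each event's @timestamp, not from the current system time, so events whose date filter succeeded should land in indices named for the log line's date. A quick way to check what @timestamp actually contains is a temporary stdout output with the rubydebug codec; this is a debugging sketch, not part of the original pipeline:

output {
  # Print every event, including its @timestamp, so it can be compared
  # against the "timestamp" field parsed from the log line.
  stdout { codec => rubydebug }

  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => ["xx.xx.xx.xxx:9200"]
      index => "name-appengine-%{+YYYY.MM.dd}"
    }
  }
}

If @timestamp shows today's date for old log lines, the date filter (or the grok pattern feeding it) would be the place to look; if old lines never appear at all, the file input's read position is the more likely culprit.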
