Hi all!
I have a newbie question for you.
Right now, I have configured my Logstash destination to point at my single Elasticsearch node, like this:

```
if [type] == "syslog" {
  elasticsearch {
    # hosts expects an array of strings
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```
So if I understand correctly, this will create a new index every day, am I right?
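To make sure I understand the pattern, here is how I picture the daily naming (a plain Python sketch, not Logstash itself; the fixed date is just for illustration, and I am assuming Logstash formats the date from the event timestamp in UTC):

```python
from datetime import datetime, timedelta, timezone

# Sketch: what "logstash-%{+YYYY.MM.dd}" should expand to on consecutive days.
def index_name(day: datetime) -> str:
    return day.strftime("logstash-%Y.%m.%d")

today = datetime(2024, 1, 15, tzinfo=timezone.utc)  # fixed date for the example
names = [index_name(today + timedelta(days=i)) for i in range(3)]
print(names)  # ['logstash-2024.01.15', 'logstash-2024.01.16', 'logstash-2024.01.17']
```

So every day's events would land in a fresh `logstash-YYYY.MM.dd` index.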
Also, I have a Curator script running that should clean up data older than 30 days:
```yaml
---
actions:
  1:
    action: delete_indices
    description: >-
      Delete indices older than 30 days (based on index name), for logstash-
      prefixed indices. Ignore the error if the filter does not result in an
      actionable list of indices (ignore_empty_list) and exit cleanly.
    options:
      ignore_empty_list: True
      timeout_override:
      continue_if_exception: False
      disable_action: False
    filters:
    - filtertype: pattern
      kind: prefix
      value: '^(logstash-).*$'
      exclude:
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 30
      exclude:
```
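My mental model of the name-based age filter is something like this Python sketch (the `indices_to_delete` helper is hypothetical, just my understanding of `filtertype: age` with `source: name` and `timestring: '%Y.%m.%d'`, not Curator's actual code):

```python
from datetime import datetime, timedelta

# Sketch: parse the date embedded in each index name and select only the
# indices whose embedded date falls before the cutoff (older than N days).
def indices_to_delete(indices, today, days=30):
    cutoff = today - timedelta(days=days)
    to_delete = []
    for name in indices:
        date_part = name.removeprefix("logstash-")
        indexed_on = datetime.strptime(date_part, "%Y.%m.%d")
        if indexed_on < cutoff:
            to_delete.append(name)
    return to_delete

today = datetime(2024, 2, 15)  # fixed date for the example
indices = ["logstash-2024.01.10", "logstash-2024.01.20", "logstash-2024.02.14"]
print(indices_to_delete(indices, today))  # ['logstash-2024.01.10']
```

If that is roughly how it works, the whole scheme only makes sense when the date is actually in the index name.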
But I am not sure that I am doing this the right way, because when I check the status of my Elasticsearch node in Cerebro, some shards do not seem to be assigned...
So should I name my index "name-of-the-index" instead of "name-of-the-index-%{+YYYY.MM.dd}"?
Will my Curator script still work either way? And will my shards get assigned this time?
I know, plenty of questions...
Thanks for your comments!