Forgive me; I last used ~v2 of the ELK stack and am jumping forward, so I believe it's something simple, but it's not jumping out at me after staring at the screen for a couple of days now. I get the same error from Curator at the command line as what's recorded in debug mode in the log file, and the close action is not performed. Can someone please take a quick pass and advise what I'm doing wrong? Using Curator v5.5.1 and ES v6.2.2.
/usr/bin/curator --config /etc/elasticsearch-curator/curator.yml /etc/elasticsearch-curator/action_close.yml
Below is the response from the debug log; the command-line output shows the same general error. The rest of the debug output looks "good" — the indices "remain in the list", etc., and no errors appear until the very end. I can provide the rest of the debug log if requested.
2018-03-27 09:35:03,401 ERROR curator.cli run:184 Failed to complete action: close. <class 'curator.exceptions.FailedExecution'>: Exception encountered. Rerun with loglevel DEBUG and/or check Elasticsearch logs for more information. Exception: TransportError(403, 'cluster_block_exception', 'blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];')
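From searching around, that 403 looks like it might be Elasticsearch's index-level read-only-allow-delete block, which ES 6.x applies automatically once the flood-stage disk watermark is exceeded, rather than anything in my Curator config. If that's right, something like this should show and clear it (host/port taken from my curator.yml below):

```shell
# Show which indices currently carry index-level blocks
# (empty output means no blocks are set)
curl -s 'http://127.0.0.1:9200/_all/_settings?filter_path=*.settings.index.blocks'

# Clear the read-only-allow-delete block on all indices
curl -s -XPUT 'http://127.0.0.1:9200/_all/_settings' \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'
```

Though from what I've read, ES will just re-apply the block if disk usage stays above `cluster.routing.allocation.disk.watermark.flood_stage` (default 95% in 6.x), so freeing disk space would be the real fix rather than only clearing the setting.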
curator.yml
client:
  hosts:
    - 127.0.0.1
  port: 9200
  url_prefix:
  use_ssl: False
  certificate:
  client_cert:
  client_key:
  ssl_no_validate: False
  http_auth:
  timeout: 30
  master_only: False

logging:
  loglevel: DEBUG
  logfile: /var/log/elasticsearch-curator/debug.log
  logformat: default
  blacklist: ['elasticsearch', 'urllib3']
action_close.yml
actions:
  1:
    action: close
    description: >-
      Close indices older than 2 days (based on index name), for logstash-
      prefixed indices.
    options:
      delete_aliases: False
      timeout_override:
      continue_if_exception: False
      disable_action: False
    filters:
      - filtertype: pattern
        kind: prefix
        value: logstash-
        exclude: False
      - filtertype: age
        source: name
        direction: older
        timestring: '%Y.%m.%d'
        unit: days
        unit_count: 2
        exclude:
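In case it helps, this is how I've been validating that the filters select the expected indices before the real run (Curator's --dry-run flag, with the same config paths as above); the dry run logs what would be closed without performing the action:

```shell
# Dry run: logs which indices the filters would act on, closes nothing
/usr/bin/curator --dry-run \
  --config /etc/elasticsearch-curator/curator.yml \
  /etc/elasticsearch-curator/action_close.yml
```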
GET /_cat/indices?v
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open logstash-2018.03.22 5kZGtUBPSH-AlUCQO0Xg3w 5 1 1376736 0 1gb 1gb
yellow open logstash-2018.03.12 FEev1P9ySA6XqUvjFY65Ww 5 1 5909374 0 3.9gb 3.9gb
yellow open logstash-2018.03.11 sRJBAMlmTvyEpxshTWLEtA 5 1 3258232 0 2.2gb 2.2gb
yellow open logstash-2018.03.08 XTO86zklToiYiisPIRpX6w 5 1 305 0 617.7kb 617.7kb
green open .monitoring-alerts-6 nTLlITQdTL6P930AQHB1Lw 1 0 2 1 12.9kb 12.9kb
green open .triggered_watches DRW4zTWERtqH6SyJENuIDQ 1 0 0 0 5.1kb 5.1kb
green open .watches -VP137M7ToiwJ7TCua8FcQ 1 0 6 0 24.5kb 24.5kb
close .watcher-history-7-2018.03.22 C0alcRK9QtGJCbzLKIjWoQ
yellow open logstash-2018.03.23 kKCz3ve1SFeLeTxtt4O-Wg 5 1 909987 0 706.9mb 706.9mb
yellow open logstash-2018.03.09 quU6dgvES5mqup828g1NRw 5 1 579 0 977.9kb 977.9kb
green open .monitoring-es-6-2018.03.22 bM8BUjstTiOUPXzIR4WOSQ 1 0 925 0 479.5kb 479.5kb
yellow open logstash-2018.03.26 L-xEeawmQoueFuktZTZmZg 5 1 4138979 0 3gb 3gb
yellow open logstash-2018.03.15 QadOJOHhRPyRTSsXjcQObQ 5 1 1811456 0 1.3gb 1.3gb
yellow open logstash-2018.03.27 660-4JaHS4eSo5R6bIGryQ 5 1 1876957 0 1.2gb 1.2gb
green open .kibana m9Lh3v-MSSKCFnlqKe2iig 1 0 11 5 71.6kb 71.6kb
yellow open logstash-2018.03.13 AcHi8k9LTtangpOEi_D3sw 5 1 28971 0 25.9mb 25.9mb
yellow open logstash-2018.03.10 xbDIUicyQSuRgLyN4a1Wqw 5 1 322 0 484.9kb 484.9kb