Unable to configure Index Lifecycle Management (ILM) with Logstash Elasticsearch output

At the moment I'm simply trying to get ILM working with Logstash, Elasticsearch, and Filebeat (or any other Beat, for that matter).

This is my elasticsearch output block:

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        manage_template => false
        ilm_enabled => true
        ilm_rollover_alias => "filebeat-rollover"
        ilm_pattern => "filebeat-7.4.2-{now/d}-000001"
        ilm_policy => "ilm-default-policy"
    }
}

The policy ilm-default-policy does exist.
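To double-check that, I queried the ILM API directly (using the same localhost:9200 endpoint as in the output block, and assuming no authentication is needed since none is configured), and it comes back as expected:

curl -s 'http://localhost:9200/_ilm/policy/ilm-default-policy?pretty'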

This is what I get in my Logstash logs:

[2019-11-19T09:36:03,208][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-11-19T09:36:09,054][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-11-19T09:36:09,363][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-11-19T09:36:09,409][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-11-19T09:36:09,412][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-11-19T09:36:09,435][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-11-19T09:36:09,665][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:341:in `exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:359:in `rollover_alias_exists?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:91:in `maybe_create_rollover_alias'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/ilm.rb:10:in `setup_ilm'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:52:in `block in setup_after_successful_connection'"]}
[2019-11-19T09:36:09,696][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0xffca1e9 run>"}
[2019-11-19T09:36:09,754][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
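Judging by the backtrace, the failure happens inside rollover_alias_exists? / maybe_create_rollover_alias, i.e. while the output plugin is checking for (and possibly creating) the rollover alias. My rough approximation of that check done by hand, again assuming the same unauthenticated endpoint, would be something like:

curl -s 'http://localhost:9200/_alias/filebeat-rollover?pretty'

which I would expect to return a 404 here, since nothing has created that alias yet.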