Could not execute action: PipelineAction::Create<main> after xpack.security enabled

Logstash works fine for me until I enable xpack security in elasticsearch.yml. When I do, I get the following error from any conf that creates indices.

[ERROR] 2020-10-30 15:52:06.741 [Converge PipelineAction::Create] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

My confs that do not create new indices and use a fixed index name still work fine.

Here is an example of a failing conf:

    output {
       elasticsearch {
          hosts => ["localhost:9200"]
          index => "adminskillchange-%{+YYYY.MM.dd}"
          document_id => "%{skchg_acd}_%{skchg_user}_%{skchg_logid}_%{skchg_date}"
          manage_template => false
          user => "logstash_internal"
          password => "********"
       }
       stdout { codec => rubydebug }
    }

I have tried setting both the cluster and indices privileges to all, and it doesn't help.

Currently my role for logstash_internal has:
Cluster: monitor, manage_index_templates
Indices: write, delete, create_index, create
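For comparison, that role could be expressed through the Elasticsearch role management API like this (the role name and index patterns here are assumptions; adjust them to your own index names):

```
POST /_security/role/logstash_writer
{
  "cluster": ["monitor", "manage_index_templates"],
  "indices": [
    {
      "names": ["adminskillchange-*", "ech-*"],
      "privileges": ["write", "delete", "create_index", "create"]
    }
  ]
}
```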

I have the following in my logstash.yml
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_internal
xpack.monitoring.elasticsearch.password: ******

OS: CentOS
ELK: 7.9.3

Any assistance is greatly appreciated.

Try setting log.level to debug. You may get a more informative message.
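For example, in logstash.yml (the same setting can also be passed on the command line as --log.level debug):

```
log.level: debug
```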

[2020-10-31T20:02:30,831][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<Elasticsearch::Transport::Transport::Errors::Unauthorized: [401] >, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:319:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/client.rb:131:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-api-5.0.5/lib/elasticsearch/api/actions/ping.rb:20:in `ping'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.9.0/lib/logstash/filters/elasticsearch.rb:310:in `test_connection!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.9.0/lib/logstash/filters/elasticsearch.rb:117:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:586:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/ech.conf"], 
:thread=>"#<Thread:0x753a8cd4 run>"}
[2020-10-31T20:02:30,844][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2020-10-31T20:02:30,862][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}

I am guessing the problem is the 401; however, I am not sure why I am getting it. I have verified with curl that the username and password are good.
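For reference, the check was along these lines (host and masked password as placeholders):

```
curl -u logstash_internal:******** http://localhost:9200/
```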

Tail of my conf looks like...

    elasticsearch {
       hosts => ["http://localhost:9200"]
       index => "ech-%{+YYYY.MM.dd}"
       document_id => "%{acd}_%{callid}_%{segment}_%{ucid}"
       manage_template => false
       user => "logstash_internal"
       password => "******"
    }

logstash.yml:

xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_internal
xpack.monitoring.elasticsearch.password: ******
xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]

elasticsearch.yml:

xpack:
   security:
      enabled: true

discovery.type: single-node

Agreed. The code is getting an exception here. Whilst the username and password may be valid, perhaps they do not have the appropriate permissions/role. It is really an elasticsearch question now, not a logstash question.

I finally figured out what was causing the error. I have the following elasticsearch filter blocks higher up in my conf file.

    elasticsearch {
       hosts => ["localhost:9200"]
       index => ["synonyms"]
       result_size => 1
       query => "_id:%{[acd]}_split_%{[split3]}"
       fields => { "item_name" => "split3name" }
    }

These filter blocks were kicking out the 401s. If I add

    user => "${ES_USER}"
    password => "${ES_PWD}"

to each of them, it starts working. Final version:

    elasticsearch {
       hosts => ["localhost:9200"]
       index => ["synonyms"]
       result_size => 1
       query => "_id:%{[acd]}_split_%{[split1]}"
       fields => { "item_name" => "split1name" }
       user => "${ES_USER}"
       password => "${ES_PWD}"
    }

I really appreciate your help. Thank you.
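For anyone finding this later: the ${ES_USER} and ${ES_PWD} references resolve from environment variables or, preferably, the Logstash keystore, which can be populated like this:

```
bin/logstash-keystore create
bin/logstash-keystore add ES_USER
bin/logstash-keystore add ES_PWD
```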

A 401 is an authentication failure. There is insufficient information for anyone to diagnose this from what you have posted, but checking the Elasticsearch logs might provide more details.