Logstash not connecting to Elasticsearch after enabling xpack.security on a single node

Hello,

I have configured the Elastic Stack on Docker with the docker-compose file below, in single-node mode. Initially I had xpack.security disabled and everything worked fine. I then enabled xpack.security and restarted the stack, but Logstash no longer comes up: it cannot connect to the Elasticsearch output. Kindly check all the details below and advise.

docker-compose.yml

version: '3.2'

services:
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: volume
        source: elasticsearch
        target: /usr/share/elasticsearch/data     
      - type: bind
        source: /opt/hc/esbackup
        target: /opt/hc/esbackup
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx2g -Xms2g"
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/
        target: /usr/share/logstash/config/
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./kibana/config/kibana.yml
        target: /usr/share/kibana/config/kibana.yml
        read_only: true
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge

volumes:
  elasticsearch:
  elasticsearch_backup:
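One way to keep the newly required passwords out of the tracked config files is to inject them through Compose environment variables. A minimal sketch, assuming a hypothetical `LOGSTASH_INTERNAL_PASSWORD` variable defined in an `.env` file next to `docker-compose.yml`:

```yaml
# excerpt only: extend the logstash service's environment
# (LOGSTASH_INTERNAL_PASSWORD is an illustrative variable, not from the original compose file)
  logstash:
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
      LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD}
```

The pipeline config can then reference it as `password => "${LOGSTASH_INTERNAL_PASSWORD}"` instead of a hard-coded string.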

elasticsearch.yml

cluster.name: "docker-cluster"
network.host: 0.0.0.0
path.repo: ["/opt/hc/esbackup"]
bootstrap.memory_lock: true
discovery.type: single-node

xpack.license.self_generated.type: basic
xpack.security.enabled: true
xpack.monitoring.collection.enabled: true

logstash.yml

http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: NE2yRdCVBN6DlES2

logstash.output.conf

elasticsearch {
  hosts => "elasticsearch:9200"
  user => "elastic"
  password => "pp9SY06zlDcBlbmMP"
  manage_template => false
  #index => "%{[@metadata][beat]}-%{[@metadata][index]}-%{+YYYY.MM.dd}"
  index => "%{[@metadata][target_index]}"
}
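As an aside on the output config: Elastic's documentation recommends a dedicated writer user for event data rather than `elastic` (a superuser) or `logstash_system` (which is only for shipping monitoring data). A hedged sketch, assuming a `logstash_writer` user has been created with write access to the target indices; the name and the environment-variable password are illustrative:

```conf
elasticsearch {
  hosts    => "elasticsearch:9200"
  # hypothetical dedicated user; logstash_system cannot write event data
  user     => "logstash_writer"
  password => "${LOGSTASH_WRITER_PASSWORD}"
  manage_template => false
  index    => "%{[@metadata][target_index]}"
}
```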

kibana.yml

server.name: kibana
server.host: 0.0.0.0
elasticsearch.hosts: [ "http://elasticsearch:9200" ]
monitoring.ui.container.elasticsearch.enabled: true
elasticsearch.requestTimeout: 120000

elasticsearch.username: kibana
elasticsearch.password: oA8AEX1eKAojZgT7TjDM

error: logstash exited with code 1

Thanks

Hey @Bhanu_Praveen,

could you please post here the Logstash logs output?

I suppose you're using the Elastic built-in users, right? Did you set the password for logstash_system by running elasticsearch-setup-passwords? Can you try to authenticate as your logstash_system user with

curl -u logstash_system http://127.0.0.1:9200

Make sure the user has a role with write access to the index patterns you want it to write to. It would also help if you shared your full configuration and confirmed that you can authenticate with it.
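If you go the dedicated-writer route, the security API can create the role and user. A sketch only, with example names and an example index pattern (`hc-*`) that you would adjust to whatever your pipelines actually write:

```
# create a role allowed to write to the target indices (names are examples)
curl -u elastic -X POST "http://127.0.0.1:9200/_security/role/logstash_writer" \
  -H 'Content-Type: application/json' -d'
{
  "cluster": ["manage_index_templates", "monitor"],
  "indices": [
    { "names": ["hc-*"], "privileges": ["write", "create", "create_index"] }
  ]
}'

# create the user and assign it the role
curl -u elastic -X POST "http://127.0.0.1:9200/_security/user/logstash_writer" \
  -H 'Content-Type: application/json' -d'
{
  "password": "changeme",
  "roles": ["logstash_writer"]
}'
```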

Hi Francis,

Yes, I am using the built-in users, with passwords set via the command below.

./bin/elasticsearch-setup-passwords interactive

Able to authenticate with user logstash_system:

curl -u logstash_system http://127.0.0.1:9200
Enter host password for user 'logstash_system':
{
  "name" : "4a8d778449f0",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "s-J_B794QHeLC_IAPrIaAw",
  "version" : {
    "number" : "7.8.0",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "757314695644ea9a1dc2fecd26d1a43856725e65",
    "build_date" : "2020-06-14T19:35:50.234439Z",
    "build_snapshot" : false,
    "lucene_version" : "8.5.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}

With the command below I am able to retrieve all indices:

curl -u elastic:pp9zuwSY06zlDcBlP localhost:9200/_cat/indices

I also checked access to one of the indices with:

curl -u elastic:pp9zuwSY06zlDcBlP http://127.0.0.1:9200/*hc-gc*

I am able to retrieve all the info. Should write access be given to the logstash_system user, or to the elastic user?
I have shared all the configurations in my question above; please check. For Logstash there are no logs other than "exited with code 1".

Hello,

Sorry for the incomplete input earlier. I managed to enable Logstash log output and got the output below. It shows "Elasticsearch::Transport::Transport::Errors::Unauthorized: [401]".
I am not sure what is causing this. Please advise.

logstash_1       | [INFO ] 2020-07-28 05:43:24.655 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 9.2.11.1 (2.5.7) 2020-03-25 b1f55b1a40 OpenJDK 64-Bit Server VM 11.0.7+10-LTS on 11.0.7+10-LTS +indy +jit [linux-x86_64]"}
logstash_1       | [INFO ] 2020-07-28 05:43:24.745 [LogStash::Runner] agent - No persistent UUID file found. Generating new UUID {:uuid=>"49f1ff5a-391b-4db1-947b-43a1c51b374e", :path=>"/usr/share/logstash/data/uuid"}
logstash_1       | [WARN ] 2020-07-28 05:43:26.728 [LogStash::Runner] pipelineregisterhook - Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
logstash_1       | Please configure Metricbeat to monitor Logstash. Documentation can be found at:
logstash_1       | https://www.elastic.co/guide/en/logstash/current/monitoring-with-metricbeat.html
logstash_1       | [INFO ] 2020-07-28 05:43:30.205 [LogStash::Runner] licensereader - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
logstash_1       | [WARN ] 2020-07-28 05:43:32.950 [LogStash::Runner] licensereader - Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
logstash_1       | [INFO ] 2020-07-28 05:43:33.761 [LogStash::Runner] licensereader - ES Output version determined {:es_version=>7}
logstash_1       | [WARN ] 2020-07-28 05:43:33.764 [LogStash::Runner] licensereader - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
logstash_1       | [INFO ] 2020-07-28 05:43:34.568 [LogStash::Runner] internalpipelinesource - Monitoring License OK
logstash_1       | [INFO ] 2020-07-28 05:43:34.569 [LogStash::Runner] internalpipelinesource - Validated license for monitoring. Enabling monitoring pipeline.
logstash_1       | [INFO ] 2020-07-28 05:43:46.516 [Converge PipelineAction::Create<main>] Reflections - Reflections took 143 ms to scan 1 urls, producing 21 keys and 41 values
logstash_1       | [INFO ] 2020-07-28 05:43:50.055 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
logstash_1       | [WARN ] 2020-07-28 05:43:50.140 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
logstash_1       | [INFO ] 2020-07-28 05:43:50.168 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
logstash_1       | [WARN ] 2020-07-28 05:43:50.169 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
logstash_1       | [INFO ] 2020-07-28 05:43:50.352 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
logstash_1       | [INFO ] 2020-07-28 05:43:55.271 [[main]-pipeline-manager] elasticsearch - New ElasticSearch filter client {:hosts=>["pla11010:9200"]}
logstash_1       | [ERROR] 2020-07-28 05:43:55.938 [[main]-pipeline-manager] javapipeline - Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Elasticsearch::Transport::Transport::Errors::Unauthorized: [401] >, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:319:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/client.rb:131:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-api-5.0.5/lib/elasticsearch/api/actions/ping.rb:20:in `ping'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.7.1/lib/logstash/filters/elasticsearch.rb:270:in `test_connection!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.7.1/lib/logstash/filters/elasticsearch.rb:92:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:216:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:215:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:520:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:170:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125:in `block in start'"], 
"pipeline.sources"=>["/usr/share/logstash/pipeline/02-beats-input.conf", "/usr/share/logstash/pipeline/10-beats-filter.conf", "/usr/share/logstash/pipeline/11-gc.conf", "/usr/share/logstash/pipeline/90-elasticsearch-output.conf"], :thread=>"#<Thread:0x157c1509 run>"}
logstash_1       | [ERROR] 2020-07-28 05:43:55.965 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
logstash_1       | [INFO ] 2020-07-28 05:43:58.456 [[.monitoring-logstash]-pipeline-manager] elasticsearchmonitoring - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
logstash_1       | [WARN ] 2020-07-28 05:43:58.468 [[.monitoring-logstash]-pipeline-manager] javapipeline - 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
logstash_1       | [INFO ] 2020-07-28 05:43:58.811 [[.monitoring-logstash]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x5c3e0687 run>"}
logstash_1       | [INFO ] 2020-07-28 05:44:03.981 [[.monitoring-logstash]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>".monitoring-logstash"}
logstash_1       | [INFO ] 2020-07-28 05:44:04.790 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
logstash_1       | [WARN ] 2020-07-28 05:44:04.829 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
logstash_1       | [INFO ] 2020-07-28 05:44:04.852 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
logstash_1       | [WARN ] 2020-07-28 05:44:04.853 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
logstash_1       | [INFO ] 2020-07-28 05:44:05.076 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
logstash_1       | [INFO ] 2020-07-28 05:44:05.287 [[main]-pipeline-manager] elasticsearch - New ElasticSearch filter client {:hosts=>["pla11010:9200"]}
logstash_1       | [ERROR] 2020-07-28 05:44:05.339 [[main]-pipeline-manager] javapipeline - Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Elasticsearch::Transport::Transport::Errors::Unauthorized: [401] >, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/base.rb:319:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-transport-5.0.5/lib/elasticsearch/transport/client.rb:131:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/elasticsearch-api-5.0.5/lib/elasticsearch/api/actions/ping.rb:20:in `ping'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.7.1/lib/logstash/filters/elasticsearch.rb:270:in `test_connection!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-elasticsearch-3.7.1/lib/logstash/filters/elasticsearch.rb:92:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:216:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:215:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:520:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:170:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125:in `block in start'"], 
"pipeline.sources"=>["/usr/share/logstash/pipeline/02-beats-input.conf", "/usr/share/logstash/pipeline/10-beats-filter.conf", "/usr/share/logstash/pipeline/11-gc.conf", "/usr/share/logstash/pipeline/90-elasticsearch-output.conf"], :thread=>"#<Thread:0x599cf97d run>"}
logstash_1       | [ERROR] 2020-07-28 05:44:05.347 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
logstash_1       | [INFO ] 2020-07-28 05:44:08.528 [LogStash::Runner] runner - Logstash shut down.
docker-elk_logstash_1 exited with code 0
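Reading the stack trace, the 401 is raised in logstash-filter-elasticsearch (`test_connection!` in `filters/elasticsearch.rb`), i.e. by the elasticsearch *filter* client pointed at `pla11010:9200`, not by the output: the output restores its connection fine with the elastic user. That filter block presumably carries no credentials, which worked while security was disabled but fails now. A hedged sketch of the fix, assuming the filter lives in one of the filter files listed in `pipeline.sources`; the user and password shown are illustrative:

```conf
filter {
  elasticsearch {
    hosts    => ["pla11010:9200"]
    # once xpack.security is enabled, the filter needs its own credentials;
    # the remaining filter options from the original config are omitted here
    user     => "elastic"
    password => "${ELASTIC_PASSWORD}"
  }
}
```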

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.