Why can't my Logstash account and password connect to ES?

I cannot connect to Elasticsearch from Logstash in my Docker environment. The account and password are configured correctly, but Logstash still cannot connect.

Curl Result:

logstash@0ef097e0f:~$ curl -u elastic:7z_xxxxxxx http://elasticsearch:9200
{
  "name" : "255cd0e100f0",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "EVDdihBUR76RLdIQB7VVOw",
  "version" : {
    "number" : "8.17.0",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "2b6a7fed44faa321997703718f07ee0420804b41",
    "build_date" : "2024-12-11T12:08:05.663969764Z",
    "build_snapshot" : false,
    "lucene_version" : "9.12.0",
    "minimum_wire_compatibility_version" : "7.17.0",
    "minimum_index_compatibility_version" : "7.0.0"
  },
  "tagline" : "You Know, for Search"
}
logstash@0ef097e0f:~$

logstash.conf configuration:

 elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      index => "erpadmin-logs-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "7z_xxxxxxx"
      ssl_verification_mode => "none"
    }

elasticsearch.yml configuration:

cluster.name: "docker-cluster"
network.host: 0.0.0.0
xpack.security.enabled: true

Logstash logs:

configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
[2025-01-16T08:56:40,947][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Elastic Agent to monitor Logstash. Documentation can be found at:
https://www.elastic.co/guide/en/logstash/current/monitoring-with-elastic-agent.html
[2025-01-16T08:56:41,083][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2025-01-16T08:56:41,108][WARN ][logstash.licensechecker.licensereader] Health check failed {:code=>401, :url=>http://elasticsearch:9200/, :message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'"}
[2025-01-16T08:56:41,112][WARN ][logstash.licensechecker.licensereader] Elasticsearch main endpoint returns 401 {:message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}}],\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}},\"status\":401}"}
[2025-01-16T08:56:41,112][ERROR][logstash.licensechecker.licensereader] Unable to retrieve Elasticsearch cluster info. {:message=>"Could not read Elasticsearch. Please check the credentials", :exception=>LogStash::ConfigurationError}
[2025-01-16T08:56:41,138][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2025-01-16T08:56:41,147][WARN ][logstash.licensechecker.licensereader] Health check failed {:code=>401, :url=>http://elasticsearch:9200/, :message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'"}
[2025-01-16T08:56:41,150][WARN ][logstash.licensechecker.licensereader] Elasticsearch main endpoint returns 401 {:message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}}],\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}},\"status\":401}"}
[2025-01-16T08:56:41,150][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Could not read Elasticsearch. Please check the credentials"}
[2025-01-16T08:56:41,157][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.

Let's check a couple of things together.

  • Try to push a document into Elasticsearch manually for testing (with security enabled, include credentials; see the sketch after this list).
curl -XPOST "https://localhost:9200/_bulk" -H "Content-Type: application/json" -d'
{ "index": { "_index": "my_test_index", "_id": 1 } }
{ "field": "foo" }
'
  • Enable debug logging for Logstash and check the logs:
bin/logstash --debug -f your_logstash_conf_file.conf
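Since xpack.security.enabled is true on this cluster, the manual bulk test needs credentials too. A minimal sketch, reusing the URL and the (redacted) elastic password from the curl output above:

curl -u elastic:7z_xxxxxxx -XPOST "http://elasticsearch:9200/_bulk" -H "Content-Type: application/json" -d'
{ "index": { "_index": "my_test_index", "_id": 1 } }
{ "field": "foo" }
'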

Thank you for your help, but the problem has not been resolved. I tried adding the parameter

ssl_verification_mode => "none"

but the issue still persists. I believe it may be related to the X-Pack configuration.

My setup and logs (environment: Docker container deployment) are unchanged from my original post above.

Hi @Jerry_Williams Welcome to the community!

Exactly how are you starting Logstash?

Command line or systemctl?

If systemctl

Share your pipelines.yml

A common problem is having more than one .conf file; they get concatenated, and then your elasticsearch output settings can get overwritten.

Do you have more than one .conf file in

/etc/logstash/conf.d/*.conf
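A quick way to check is to list every config Logstash will load. The package path is the one above; the Docker path and container name below are assumptions based on the default image layout and the compose file shown later in this thread:

# Package (systemd) install: all of these get concatenated into one pipeline
ls -l /etc/logstash/conf.d/*.conf

# Docker image: list what is mounted into the default pipeline directory
docker exec logstash ls -l /usr/share/logstash/pipeline/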

Thank you, @stephenb.

I am deploying and running it inside a Docker container.

docker-compose stop logstash
docker-compose rm -f logstash
docker-compose up -d logstash

The running containers:

[root@centos-81 elk]# docker ps
CONTAINER ID   IMAGE                           COMMAND                  CREATED        STATUS        PORTS                                                                                  NAMES
0ef097e0f7d6   logstash:8.17.0                 "/usr/local/bin/dock…"   24 hours ago   Up 2 hours    0.0.0.0:5044->5044/tcp, :::5044->5044/tcp, 9600/tcp                                    logstash
255cd0e100f0   elasticsearch:8.17.0            "/bin/tini -- /usr/l…"   3 weeks ago    Up 26 hours   0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 0.0.0.0:9300->9300/tcp, :::9300->9300/tcp   elasticsearch
0de9cfc1ca43   kibana:8.17.0                   "/bin/tini -- /usr/l…"   3 weeks ago    Up 26 hours   0.0.0.0:5601->5601/tcp, :::5601->5601/tcp                                              kibana

My pipelines:

logstash@0ef097e0f7d6:~/config$ cat logstash.yml
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]
logstash@0ef097e0f7d6:~/config$ cat pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline"
logstash@0ef097e0f7d6:~/config$ cd ../pipeline/
logstash@0ef097e0f7d6:~/pipeline$ pwd
/usr/share/logstash/pipeline
logstash@0ef097e0f7d6:~/pipeline$ ls
logstash.conf
logstash@0ef097e0f7d6:~/pipeline$ cat logstash.conf
input {
  redis {
    host => "192.168.106.229"
    port => 8020
    db => 1
    data_type => "list"
    key => "defalut-service"
    add_field => { "channel" => "defalut-service" }
  }
}

filter {
  if [source] == "defalut-service" {
    mutate {
      add_field => { "target_index" => "defalut-service-logs-%{+YYYY.MM.dd}" }
    }
  }
}


output {
  if [target_index] {
    elasticsearch {
      hosts => [ "http://elasticsearch:9200" ]
      index => "%{target_index}"
      user => "elastic"
      password => "7z_XXXXXXXXXXX"
      ssl_verification_mode => "none"
    }
  }
  stdout { codec => rubydebug }
}

My docker-compose.yml:

services:
  elasticsearch:
    image: elasticsearch:8.17.0
    container_name: elasticsearch
    environment:
      - cluster.name=elasticsearch
      - discovery.type=single-node
      - ES_JAVA_OPTS=-Xms512m -Xmx512m
    volumes:
      - /home/docker-compose/elk/elasticsearch/plugins:/usr/share/elasticsearch/plugins
      - /home/docker-compose/elk/elasticsearch/data:/usr/share/elasticsearch/data
      - /home/docker-compose/elk/elasticsearch/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
    ports:
      - 9200:9200
      - 9300:9300
    networks:
      - elk-net  

  logstash:
    image: logstash:8.17.0
    container_name: logstash
    ports:
      - 5044:5044
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - /home/docker-compose/elk/logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    depends_on:
      - elasticsearch
    networks:
      - elk-net
  kibana:
    image: kibana:8.17.0
    container_name: kibana
    ports:
      - 5601:5601
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200  
      - i18n.locale=zh-CN
    volumes:
      - /home/docker-compose/elk/kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml
    depends_on:
      - elasticsearch
    networks:
      - elk-net
networks:
  elk-net:
    driver: bridge

Perhaps you are not running the pipeline you think you are.

Perhaps try

volumes:
      - /home/docker-compose/elk/logstash/:/usr/share/logstash/pipeline/
   

Make sure you only have one .conf file in

/home/docker-compose/elk/logstash/

Also, an easy test: comment out the elasticsearch output and just use the stdout output.

@stephenb Even though I changed the mount directory to make sure there was only one .conf file under it, and edited the .conf file to temporarily remove the ES output, the 401 still appears.

input {
  redis {
    host => "192.168.106.229"
    port => 8020
    db => 1
    data_type => "list"
    key => "defalut-service"
    add_field => { "channel" => "defalut-service" }
  }
}

filter {
  if [source] == "defalut-service" {
    mutate {
      add_field => { "target_index" => "defalut-service-logs-%{+YYYY.MM.dd}" }
    }
  }
}

output {
  stdout { codec => rubydebug }
}

Logstash log:

[2025-01-16T11:29:02,750][WARN ][logstash.licensechecker.licensereader] Health check failed {:code=>401, :url=>http://elasticsearch:9200/, :message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'"}
[2025-01-16T11:29:02,754][WARN ][logstash.licensechecker.licensereader] Elasticsearch main endpoint returns 401 {:message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}}],\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}},\"status\":401}"}
[2025-01-16T11:29:02,754][ERROR][logstash.licensechecker.licensereader] Unable to retrieve Elasticsearch cluster info. {:message=>"Could not read Elasticsearch. Please check the credentials", :exception=>LogStash::ConfigurationError}
[2025-01-16T11:29:02,783][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2025-01-16T11:29:02,790][WARN ][logstash.licensechecker.licensereader] Health check failed {:code=>401, :url=>http://elasticsearch:9200/, :message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'"}
[2025-01-16T11:29:02,792][WARN ][logstash.licensechecker.licensereader] Elasticsearch main endpoint returns 401 {:message=>"Got response code '401' contacting Elasticsearch at URL 'http://elasticsearch:9200/'", :body=>"{\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}}],\"type\":\"security_exception\",\"reason\":\"missing authentication credentials for REST request [/]\",\"header\":{\"WWW-Authenticate\":[\"Basic realm=\\\"security\\\", charset=\\\"UTF-8\\\"\",\"ApiKey\"]}},\"status\":401}"}
[2025-01-16T11:29:02,817][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Could not read Elasticsearch. Please check the credentials"}
[2025-01-16T11:29:02,823][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.

Logstash section of my docker-compose.yml:

  logstash:
    image: logstash:8.17.0
    container_name: logstash
    ports:
      - 5044:5044
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - /home/docker-compose/elk/logstash/:/usr/share/logstash/pipeline/
    depends_on:
      - elasticsearch
    networks:
      - elk-net

@stephenb If I turn off the X-Pack security features, everything works normally, but my network environment does not allow that.

This is telling...
This means you are not running the conf file you think you are... because it should not be connecting to Elasticsearch at all.

But you can see it is trying to connect to

http://elasticsearch:9200/

I suspect it is running the default .conf in the Docker container... or something else.

Exec into the logstash container; I bet you will see the default .conf there, and that is what is being run, or something else...

Something basic... your directory or file permissions, the wrong file, something like that.

Bind-mounted configuration files will retain the same permissions and ownership within the container that they have on the host system. Be sure to set permissions such that the files will be readable and, ideally, not writeable by the container’s logstash user (UID 1000).
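A minimal sketch for verifying that, using the host path and container name from this thread (adjust to your own setup):

# On the host: the mounted files should be readable by UID 1000
ls -ln /home/docker-compose/elk/logstash/

# Inside the container: check what the logstash user actually sees
docker exec -u logstash logstash ls -l /usr/share/logstash/pipeline/
docker exec -u logstash logstash cat /usr/share/logstash/pipeline/logstash.conf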


@stephenb Thank you very much for your patient explanation. I found the problem: when Logstash is started via docker-compose.yml, the logstash.yml file contains the following configuration:

xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

Since no Elasticsearch username and password were provided for this configuration, a 401 error kept occurring.

I mounted the file from the host, and then I could either remove this setting or add the username and password. Since I currently don't need this feature, I chose to remove it.
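For reference, a sketch of both options in logstash.yml (these are the standard Logstash monitoring settings; the credentials are placeholders):

# Option 1: drop legacy internal monitoring entirely
xpack.monitoring.enabled: false
# ...and remove the xpack.monitoring.elasticsearch.hosts line

# Option 2: keep internal monitoring, but give it credentials
# xpack.monitoring.enabled: true
# xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]
# xpack.monitoring.elasticsearch.username: "elastic"
# xpack.monitoring.elasticsearch.password: "7z_xxxxxxx"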

Many thanks to the community for the support and help, and special thanks to Stephen Brown.


@Jerry_Williams That is a good find! Something I will try to remember for next time.

Thank you very much. If you hadn't suggested commenting out the ES connection in the output, I wouldn't have seen where else Logstash was connecting to ES.

In case anyone missed it: to solve this problem, you need to provide Logstash with credentials for xpack.monitoring.elasticsearch, for example via docker-compose.

Example:

  logstash:
    image: logstash:8.12.2
    build: ./logstash
    ports:
      - "5000:5000"
    volumes:
      - ./logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    environment:
      - LS_JAVA_OPTS=-Xmx256m -Xms256m
      - ELASTICSEARCH_USERNAME=${ELK_LOGSTASH_USER}
      - ELASTICSEARCH_PASSWORD=${ELK_LOGSTASH_PASSWORD}
      - xpack.monitoring.enabled=true
      - xpack.monitoring.elasticsearch.username=${ELK_LOGSTASH_USER}
      - xpack.monitoring.elasticsearch.password=${ELK_LOGSTASH_PASSWORD}
    depends_on:
      - elasticsearch
    networks:
      - elk