Got response code '401' contacting Elasticsearch

I was working with Logstash, Elasticsearch and Kibana without security and everything was working, but I needed to add basic authentication to Elasticsearch before going to production, and Logstash stopped sending messages to Elasticsearch.

Error:

Oct 31 17:31:15 vm-mcs-kafka logstash[1169]: [2024-10-31T17:31:15,429][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"http://10.145.2.5:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '401' contacting Elasticsearch at URL 'http://10.145.2.5:9200/'"}
Oct 31 17:31:18 vm-mcs-kafka logstash[1169]: [2024-10-31T17:31:18,270][ERROR][logstash.outputs.elasticsearch][main][8fcfb049450aaf0d9cfe6fd056986bfea576fc70664e1a4563b8f521cac8206b] Encountered a retryable error (will retry with exponential backoff) {:code=>401, :url=>"http://10.145.2.5:9200/_bulk", :content_length=>441}

I've configured the Elasticsearch user and password in logstash.conf:

if "kafka-process-topic" in [tags] {
        elasticsearch {
            hosts => ["http://10.145.2.5:9200"]
            index => "kafka-process-topic-%{+YYYY-MM-dd}"
	        ssl => false
            ssl_certificate_verification => false
	        user => "logstash_internal"
            password => "######"
        }
  }

I tried many different variations but none worked. I tried the elastic user and password that I use to log in to Kibana, I tried without ssl and ssl_certificate_verification, and I tried passing the values without quotes.

I know it's not the user, password or network, because it works when I try with curl:

curl http://10.145.2.5:9200/_xpack -k -u elastic:######
curl http://10.145.2.5:9200/_xpack -k -u logstash_internal:#######

All the products are in the same version: 7.17.24

Try with https
hosts => ["https://10.145.2.5:9200"]


I found out the real problem thanks to your answer @Rios. After I changed the logstash.conf file the log kept showing HTTP, so I realized the problem was that the service restart was not working and it wasn't picking up my file after the change, so I did a few steps:

1-) Find the logstash process

ps -ef | grep logstash

2-) Kill the process:

sudo kill -9 <pid from previous command>

3-) Start the logstash service:

service logstash start
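
If the host runs systemd (an assumption; the steps above use the service command), the same restart and a check that the new process actually picked up the config can be done like this:

sudo systemctl restart logstash
sudo systemctl status logstash     # confirm the service is active and the PID changed
sudo journalctl -u logstash -f     # watch the logs; the output URL should now match the new config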

Sometimes the thread gets stuck and you can't restart it with a service restart. I was also able to remove some additional configuration from logstash.conf:

  if "kafka-process-topic" in [tags] {
        elasticsearch {
            hosts => ["http://10.145.2.5:9200"]
            index => "kafka-process-topic-%{+YYYY-MM-dd}"
	        user => "logstash_internal"
            password => "######"
        }
  }

Ah sorry, I didn't pay attention to that param being set to false. You used curl -k, which is not needed for an http connection.

Be aware that if you upgrade to 8.x one day, some config options have changed, e.g. ssl => true/false is now ssl_enabled, etc.
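
For example, on 8.x the same output might look roughly like this (a sketch based on the renamed SSL options; check the current elasticsearch output plugin docs for the exact names and values):

if "kafka-process-topic" in [tags] {
    elasticsearch {
        hosts => ["https://10.145.2.5:9200"]
        index => "kafka-process-topic-%{+YYYY-MM-dd}"
        ssl_enabled => true
        ssl_verification_mode => "none"    # replaces ssl_certificate_verification => false
        user => "logstash_internal"
        password => "######"
    }
}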