Got response code '401' contacting Elasticsearch at URL (Elasticsearch 7.4)

Hi, I have enabled X-Pack security and I am facing issues with the Logstash connection.
elasticsearch.yml

xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.audit.enabled: true
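As a side note, with `xpack.security.transport.ssl.enabled: true` the nodes also need certificate settings, or they will refuse to talk to each other over transport. A minimal sketch (the keystore filename is a placeholder — use whatever you generated with `bin/elasticsearch-certutil cert`):

```
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
# elastic-certificates.p12 is a placeholder name; generate it with
# bin/elasticsearch-certutil cert
xpack.security.transport.ssl.keystore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: elastic-certificates.p12
```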

output.conf in logstash

output {
  if [type] == "elblogs" {
    elasticsearch {
      hosts => ["someip:9200"]
      index => "elb-%{+YYYY.MM.dd}"
      user => "logstash_admin_user"
      password => "logstash_admin_user"
    }
  }
}
[2019-10-30T12:05:54,093][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://:9200/'"}

[2019-10-30T12:05:54,105][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://:9200/'"}
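One thing worth checking: the failing URL in the log is `http://:9200/`, with an empty host. If that is not just redaction, the 401 may be coming from a different connection than the `output` block above — for example Logstash's internal monitoring, which authenticates separately via `logstash.yml`. A sketch of those settings (the host is a placeholder; `logstash_system` is the built-in monitoring user):

```
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.hosts: ["http://someip:9200"]
xpack.monitoring.elasticsearch.username: "logstash_system"
xpack.monitoring.elasticsearch.password: "changeme"  # placeholder
```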

I can authenticate as this user against the remote ES:

curl -u logstash_admin_user 'http://:9200/_xpack/security/_authenticate?pretty'
Enter host password for user 'logstash_admin_user':
{
  "username" : "logstash_admin_user",
  "roles" : [
    "superuser"
  ],
  "full_name" : "",
  "email" : "",
  "metadata" : { },
  "enabled" : true,
  "authentication_realm" : {
    "name" : "default_native",
    "type" : "native"
  },
  "lookup_realm" : {
    "name" : "default_native",
    "type" : "native"
  }
}
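Since the account authenticates fine via curl, the credentials themselves look OK. As a side note, the response shows the user has the `superuser` role; Elastic's Logstash security docs recommend a dedicated writer role instead. A sketch of such a role, created via `POST /_security/role/logstash_writer` (role name and index pattern are illustrative):

```json
{
  "cluster": ["manage_index_templates", "monitor"],
  "indices": [
    {
      "names": ["elb-*"],
      "privileges": ["write", "create", "create_index"]
    }
  ]
}
```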

Hey Ivan,

You can try the configuration below; it worked for me.

##############################################################################
# Logstash config file that ingests blog posts from a CSV file.
# The elastic user is a superuser.
# The cluster is X-Pack enabled.
##############################################################################

input {
  file {
    path => "C:/Users/PATH_TO_YOUR_CSV/blogs.csv"
    start_position => "beginning"
    # On Windows, use "NUL" instead of "/dev/null"
    sincedb_path => "/dev/null"
  }
}

filter {
  dissect {
    mapping => {
      "message" => "%{title};%{seo_title};%{url};%{author};%{date};%{category};%{locales};%{content}"
    }
  }
  date {
    match => [ "date", "MMMM dd, yyyy" ]
    target => "publish_date"
    remove_field => ["date"]
  }
  mutate {
    remove_field => ["@version", "path", "host", "message", "tags", "@timestamp"]
  }
}
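To see what the dissect and date filters do to one line, here is the same transformation emulated in plain Python (a sketch; the field names come from the dissect mapping, and the sample line is made up):

```python
from datetime import datetime

# Field names taken from the dissect mapping above
fields = ["title", "seo_title", "url", "author", "date",
          "category", "locales", "content"]

# A made-up sample CSV line in the expected ';'-separated format
message = "My Post;my-post;/blog/my-post;Jane Doe;October 30, 2019;News;en-us;Hello world"

# dissect: split the message on ';' into the named fields
event = dict(zip(fields, message.split(";", len(fields) - 1)))

# date: parse "MMMM dd, yyyy" (Joda) as "%B %d, %Y" (strptime)
# into publish_date, removing the original "date" field
event["publish_date"] = datetime.strptime(event.pop("date"), "%B %d, %Y")

print(event["title"], event["publish_date"].date())  # → My Post 2019-10-30
```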

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    user => "elastic"
    password => "elastic"
    index => "blogs"
    retry_on_conflict => 0
  }
  stdout { codec => "dots" }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.