Logstash issue after implementing xpack

Hello Team,

We are using ELK 6.4.4 and everything is working fine.

Now we are trying ELK 7.1 for the RBAC and TLS features, and we have enabled X-Pack security in Elasticsearch.

We added the following two lines to the elasticsearch.yml file:

xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
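Note that with transport TLS enabled, Elasticsearch also needs certificates for the transport layer. One way to generate them is with the bundled `elasticsearch-certutil` tool (a sketch; the install path and output filenames below are the tool's defaults for a package install):

```shell
# Generate a certificate authority (produces elastic-stack-ca.p12 by default)
/usr/share/elasticsearch/bin/elasticsearch-certutil ca

# Generate a node certificate signed by that CA (produces elastic-certificates.p12)
/usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12
```

The resulting keystore is then referenced from `xpack.security.transport.ssl.keystore.path` and `xpack.security.transport.ssl.truststore.path` in elasticsearch.yml.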

We also set the passwords for all the built-in users. Now, when we query Elasticsearch, we need to pass a built-in username and password, and it works, as shown below:

curl -XGET -u elastic:admin@123 http://192.168.56.4:9200/
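(For anyone following along: the built-in user passwords can be set interactively with the bundled tool; the path below assumes a package install of Elasticsearch.)

```shell
# Prompts for a password for each built-in user
# (elastic, kibana, logstash_system, beats_system, ...)
/usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive
```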

We added the following two lines to our kibana.yml file:

elasticsearch.username: "kibana"
elasticsearch.password: "admin@123"

Now we can access the Kibana dashboard using the elastic username and its password. We can see the Security feature on the dashboard, and we can also create new users from the Users page.


We can also still see the old logs on the Kibana dashboard.

However, we are now facing an issue: new logs are not reaching the Kibana dashboard. It seems that Logstash is not sending logs to Elasticsearch. We made the following changes to Logstash:

We added the following two lines to logstash.yml:

xpack.monitoring.elasticsearch.username: "logstash_internal"
xpack.monitoring.elasticsearch.password: "admin@123"

Below is my Logstash pipeline configuration file:

input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}


output {
  elasticsearch {
    hosts => ["192.168.56.4:9200"]
    user => "logstash_internal"
    password => "admin@123"
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

When I restart the Logstash service, I get the following error log:

[2019-06-11T17:52:37,338][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://logstash_internal:xxxxxx@192.168.56.4:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'http://192.168.56.4:9200/'"}
[2019-06-11T17:52:38,307][WARN ][logstash.outputs.elasticsearch] Error while performing sniffing {:error_message=>"Got response code '401' contacting Elasticsearch at URL 'http://192.168.56.4:9200/_nodes/http'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:164:in `check_sniff'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:157:in `sniff!'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:146:in `block in start_sniffer'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:128:in `until_stopped'", 
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:144:in `block in start_sniffer'"]}

Can you please help me understand what mistake I am making here?

Thanks.

A 401 error means the username/password authentication failed.

The first step is to check that you can authenticate directly to Elasticsearch using those credentials.
e.g.

curl -u logstash_internal:admin@123 "http://192.168.56.4:9200/_xpack/security/_authenticate"

@TimV, Thank you for your response.

The above issue is fixed. Earlier I had not created the logstash_writer role and the logstash_internal user on the Kibana dashboard. Once I created them, logs started pouring into the Kibana dashboard again.
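For anyone who hits the same problem, the role and user can also be created through the Elasticsearch security API instead of the Kibana UI. Here is a sketch; the index patterns and privileges are assumptions based on my output configuration, so adjust them to match your own indices:

```shell
# Create a role allowed to write to the Beats-based indices
curl -u elastic:admin@123 -H 'Content-Type: application/json' \
  -X POST 'http://192.168.56.4:9200/_security/role/logstash_writer' -d '{
  "cluster": ["manage_index_templates", "monitor"],
  "indices": [
    {
      "names": ["filebeat-*", "metricbeat-*"],
      "privileges": ["write", "create", "create_index"]
    }
  ]
}'

# Create the logstash_internal user and assign it that role
curl -u elastic:admin@123 -H 'Content-Type: application/json' \
  -X POST 'http://192.168.56.4:9200/_security/user/logstash_internal' -d '{
  "password": "admin@123",
  "roles": ["logstash_writer"],
  "full_name": "Internal Logstash User"
}'
```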

I followed the document below:

document

Thanks.
