After applying the tutorial "Getting started with Elasticsearch security", Logstash receives syslog data but Kibana can't show it

Hi buddies!

I'm having a problem after applying "Getting started with Elasticsearch security" to my environment. I followed this entire procedure:

And everything works fine.

Elasticsearch protected with user and password:

Kibana protected with user and password:

A user created with all privileges:

Kibana receiving Beats data and showing it in the Discover view (Metricbeat, for example):

As we can see, everything works fine, except for Logstash. My Logstash is configured to receive syslog data. Before applying the security settings, my environment was working fine, and I could see the data arriving in Kibana in the Discover view (logstash-* index). Now, after applying the security settings, when I go to the Discover view and select the logstash-* index, I can't see the data:

If I run Logstash, I can see that the plugins are working fine, because I'm still receiving syslog data without any errors:

But I don't know why Kibana doesn't receive the output anymore.
Before the security setting, I could see all syslog output in Kibana.

This is my syslog.conf file:

input {
  tcp {
    port => 514
    type => syslog
  }
  udp {
    port => 514
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program} %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "secret"
  }
  stdout { codec => rubydebug }
}
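One thing that's easy to miss with hand-edited pipelines: the `user` and `password` settings have to sit inside the `elasticsearch { }` block, not after it, or Logstash rejects the config. A quick way to catch syntax problems like that is Logstash's built-in config check, which parses the pipeline and exits without starting it. A sketch, assuming Logstash is installed under the default path and the config lives in `/etc/logstash/conf.d/` (adjust both paths to your setup):

```
# Parse the pipeline and exit; reports the line of any syntax error
bin/logstash -f /etc/logstash/conf.d/syslog.conf --config.test_and_exit
```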

Any idea where the problem is?

I've tried deleting and recreating the logstash index, but it doesn't work.
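Since security is now enabled, it's also worth checking from the command line that the index actually exists and that the user can read it, independently of Kibana. A sketch using curl (substitute your real password; these assume Elasticsearch on the default host and port):

```
# List logstash indices, authenticating as the elastic user
curl -u elastic:secret "http://localhost:9200/_cat/indices/logstash-*?v"

# Count the documents visible to that user under the index pattern
curl -u elastic:secret "http://localhost:9200/logstash-*/_count?pretty"
```

If the `_cat/indices` call shows no logstash-* index, the output stage never wrote anything; if the index exists but Kibana shows nothing, the problem is more likely the Kibana user's privileges or the time range in Discover.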

Thanks in advance for the help.

That config doesn't have an output section.
Where's your config to send the data to Elasticsearch?

Sorry, the preformatted section didn't scroll correctly for me.

Do your elasticsearch nodes have TLS on the http port?
Specifically is xpack.security.http.ssl.enabled true in your elasticsearch.yml ?

If so, then you need to specify that here, either by setting

ssl => true

or

elasticsearch { hosts => ["https://localhost:9200"] }
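Putting both together with the credentials, the output block would look something like this (a sketch; the `cacert` path is a placeholder for wherever your CA certificate actually lives):

```
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    ssl => true
    cacert => "/etc/logstash/certs/ca.crt"  # placeholder path to your CA certificate
    user => "elastic"
    password => "secret"
  }
}
```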

Hi TimV,

I am not using an SSL certificate. What I did was follow the procedure from the tutorial "Getting started with Elasticsearch security".

I have these lines configured in my elasticsearch.yml:

according to this part of the procedure:

I've solved the problem by changing my configuration from the "TCP input plugin":

to this one, the "Syslog input plugin":

This is my configuration file now for syslog:

input {
  syslog {
    port => 12345
    codec => cef
    syslog_field => "syslog"
    grok_pattern => "<%{POSINT:priority}>%{SYSLOGTIMESTAMP:timestamp}"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "secret"
  }
}

As we can see, with this plugin Kibana can receive the data from Logstash:

I still need to set up some Grok filters in my configuration file, but for the moment it's working fine.
I don't understand why the TCP input plugin doesn't work with the security configuration applied. Maybe I am doing something wrong. Apparently, I am not passing the credentials correctly when sending the data to Elasticsearch (output parameters):


In any case, it's working now with the "Syslog input plugin", but I would like to know how to make it work with the "TCP input plugin".
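In case it helps anyone else: as far as I understand, the choice of input plugin shouldn't affect authentication at all, since the credentials live entirely in the output block. This is the TCP-input version I would expect to work (a sketch, untested on my side; port and credentials taken from my earlier configs):

```
input {
  tcp {
    port => 514
    type => syslog
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"        # same credentials as in the working syslog-input config
    password => "secret"
  }
}
```

Note that binding to port 514 usually requires root privileges, which is another common reason a TCP/UDP syslog input silently receives nothing.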

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.