No Structured Logs Found in Stack Monitoring when using Logstash

Hello All,

Hope someone can help, as I've spent the past week trying to resolve this issue and couldn't.

So, some backstory: if I set up Elasticsearch with Filebeat pointing directly to Elasticsearch, the Logs section in Stack Monitoring reports the server logs and recent logs with no problem.

When I modify Filebeat to point to Logstash instead, the logs still make it to Elasticsearch, but I always get the "No Structured Logs Found" message with a recommendation to point to the .json logs, which are already set up and being shipped to Elasticsearch.

My guess at what is happening is that the cluster UUID is not being sent with the server logs from Logstash to Elasticsearch, whereas it would make it if going from Beats straight to Elasticsearch.

I am still very new to the whole stack, and I imagine you can inject the cluster UUID in the Logstash pipeline config, but I cannot figure it out!
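
For what it's worth, this is roughly what I was imagining, just as a sketch; the field name and the hard-coded UUID are my own assumptions, and I don't actually know whether this is the field Stack Monitoring reads:

filter {
  mutate {
    # hypothetical: hard-code my cluster's UUID into the field I assume
    # Stack Monitoring looks for (replace the placeholder with the real UUID)
    add_field => { "[elasticsearch][cluster][uuid]" => "REPLACE_WITH_CLUSTER_UUID" }
  }
}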

Hope someone can help!

Thank you

Welcome to our community! :smiley:

I think you'd need to share your Logstash config to provide a definitive answer, but my guess would be that the data isn't going to the indices that Monitoring is expecting.

Sure, here is my logstash.yml:

config.reload.automatic: true
node.name: logstash.com
http.host: logstash.com
http.port: 9600

xpack.monitoring.elasticsearch.hosts: ["https://logstash.com:9200"]
xpack.monitoring.elasticsearch.username: 'logstash_system'
xpack.monitoring.elasticsearch.password: 'password'
xpack.monitoring.enabled: true
xpack.monitoring.collection.interval: 10s
xpack.monitoring.elasticsearch.ssl.certificate_authority: /etc/logstash/certs/ca.crt
path.data: /var/lib/logstash
path.logs: /var/log/logstash

and my pipeline config is:

input {
  beats {
    port => 5044
    host => "logstash.com"
    ssl => true
    ssl_key => "/etc/logstash/certs/node.key"
    ssl_certificate => "/etc/logstash/certs/node.crt"
  }
}
output {
  elasticsearch {
    hosts => ["https://logstash.com:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    cacert => "/etc/logstash/certs/ca.crt"
    user => 'elastic'
    password => 'password'
  }
}

Just to share: if I configure Beats to use Elasticsearch instead of Logstash, there are no issues!

At this point I am giving up, as I have no clue how to troubleshoot this. Hope someone comes along.


You may be missing the pipeline setting:

input {
  beats {
    port => 5044
  }
}

output {
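  # Filebeat modules put the name of the Elasticsearch ingest pipeline that
  # should parse each event into [@metadata][pipeline]; forwarding it here is
  # what lets Elasticsearch structure the JSON logs the way Stack Monitoring expects.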
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}" 
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "secret"
    }
  }
}