Send Filebeat Internal Monitoring Via Logstash

Basic question:

Is there a way to configure Filebeat internal monitoring to be shipped via Logstash? I am trying to add monitoring for Filebeat running on some external systems, but those systems can only send data via Logstash (not directly to Elasticsearch on the monitoring cluster).

I haven't been able to track down an example of how to configure this.

Is it as easy as just setting monitoring.elasticsearch.hosts to point at a Logstash instance instead of an Elasticsearch instance?
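In other words, something like this in filebeat.yml, with the monitoring output pointed at a Logstash host instead of Elasticsearch (the host and port below are just placeholders):

```yaml
# filebeat.yml - the kind of change I'm wondering about
# (Logstash host/port are placeholders)
monitoring.enabled: true
monitoring.elasticsearch:
  hosts: ["http://logstash-host.example.com:5044"]
```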

Hi Joseph,

Sending Filebeat internal monitoring via Logstash is not supported, and I don't expect that pointing the monitoring Elasticsearch setting at a Logstash input would work.

You'd probably have a better chance of monitoring Filebeat with a Metricbeat process set up to send data to a Beats input on the Logstash instance.

However, I don't recommend this, since having Logstash between Beats and Elasticsearch adds some extra complication.

Depending on how you have Logstash configured, the data could get written in a way that the Stack Monitoring UI doesn't know how to read.
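If you do go that route, a rough sketch of the pieces involved might look like the following; the hosts, ports, and module settings here are assumptions, not a tested config. Metricbeat's beat module scrapes Filebeat's local HTTP stats endpoint and ships the results to a Beats input on Logstash.

```yaml
# filebeat.yml - expose the local stats endpoint for Metricbeat to scrape
# (assumption: the default port 5066)
http.enabled: true
http.port: 5066
```

```yaml
# metricbeat.yml - collect Filebeat stats and ship them through Logstash
metricbeat.modules:
  - module: beat
    metricsets: ["stats", "state"]
    period: 10s
    hosts: ["http://localhost:5066"]
    xpack.enabled: true   # shape the documents for Stack Monitoring

output.logstash:
  hosts: ["logstash-host.example.com:5044"]   # placeholder Beats input on Logstash
```

The Logstash pipeline would still need a beats input and an elasticsearch output pointed at the monitoring cluster, which is where the caveat above about the Stack Monitoring UI comes in.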

Thanks, I ended up being able to set it up to go directly to the ES monitoring cluster.
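In case it helps anyone else, the working config looks roughly like this; the hosts and credentials are placeholders for our environment:

```yaml
# filebeat.yml - ship internal monitoring straight to the monitoring cluster
monitoring.enabled: true
monitoring.elasticsearch:
  hosts: ["https://monitoring-es.example.com:9200"]   # placeholder monitoring cluster
  username: "remote_monitoring_user"                  # or another user with monitoring privileges
  password: "${MONITORING_PASSWORD}"                  # pulled from the environment/keystore
```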

Due to a misconfiguration while testing, I have around 280 .monitoring documents with a null cluster_uuid... is there an easy way to remove those, or to update that field to the actual cluster_uuid?

Great that you were able to go directly to ES, Joseph!

You can use the Update By Query API if all the documents should have the same cluster_uuid, or the Delete By Query API if you just want to remove them (both are covered in the Elasticsearch 7.15 guide). Or, since it's only 280 docs, you could just leave them there and let index rotation roll them off eventually; they probably take up very little space.
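Roughly, the requests would look like this. The index pattern and the empty-string match on cluster_uuid are assumptions; check a few of the stray documents first and adjust the query to match what's actually stored (the field might be missing rather than empty).

```
# Sketch: delete the stray documents (adjust index pattern and query first)
POST /.monitoring-beats-7-*/_delete_by_query
{
  "query": {
    "term": { "cluster_uuid": "" }
  }
}

# Or set the field to the real cluster UUID instead
POST /.monitoring-beats-7-*/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": "ctx._source.cluster_uuid = params.uuid",
    "params": { "uuid": "<your-cluster-uuid>" }
  },
  "query": {
    "term": { "cluster_uuid": "" }
  }
}
```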

Yeah, I'll probably just let them rotate out.

Thanks!

