Filebeat modules via Logstash?

Hello!

TL;DR: How do we set up and update module dashboards and pipelines when Filebeat has no direct connectivity to the Kibana/ES hosts?

Long version:
We're using Filebeat without direct access to ES or Kibana; everything goes via Logstash.

To use Filebeat modules this way, we install Filebeat on both the Kibana host and one of the ES hosts, activate the module there, and set up the dashboards and pipelines. This is a cumbersome approach. Also, our experience is that the pipelines in ES are not updated when the Filebeat version changes.

For example, we set up the threatintel module with a MISP source using Filebeat 8.4.2, and a pipeline called "filebeat-8.4.2-threatintel-misp-pipeline" was created automatically. Now that Filebeat has been patched to 8.4.3, our events get rejected (400, illegal_argument_exception) because there is no pipeline called filebeat-8.4.3-threatintel-misp-pipeline.

I guess I could work around this particular error by hard-coding the pipeline name in the Logstash output, but if the module is updated, the pipeline might need an update as well.

I can't imagine that the correct approach is to run filebeat setup every time Filebeat is updated (we're auto-patching our environment, so we don't know when the version has changed until we see an error like this).

Thank you for any assistance,
Kind regards.

David

Hi @dygland

We use the same approach. We have one server sitting outside our main systems that we use to connect directly to Elasticsearch and Kibana to set up the dashboards. This is cumbersome and also requires the setup to be run again with each version change.

I hope someone has a better approach.

This is not possible; to set up or update the modules, you will need connectivity between Filebeat and Elasticsearch/Kibana.

The main issue here is that since Filebeat modules use ingest pipelines, Elastic expects the output to be Elasticsearch. This is also true for the Elastic Agent, which will probably replace the Beats in the future.

What I would do in this case is hard-code the ingest pipeline in the Logstash configuration and create some automation to check whether there is a new version of the original pipeline; if there is, update the hard-coded pipeline with it.

For example, you could create a copy of the filebeat-8.4.2-threatintel-misp-pipeline ingest pipeline named threatintel-misp-pipeline and use that in your Logstash output. At the same time, you would need some automation that checks whether a higher version of the original pipeline exists, like filebeat-8.4.3-threatintel-misp-pipeline; if it does, you would copy its content and update threatintel-misp-pipeline with it.
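A minimal sketch of what that Logstash output could look like (the host, credentials, and index pattern below are placeholders, not something from this thread):

```
output {
  elasticsearch {
    hosts => ["https://es01.example.org:9200"]              # placeholder host
    user => "logstash_writer"                               # placeholder credentials
    password => "${ES_PWD}"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}" # typical Filebeat index pattern
    pipeline => "threatintel-misp-pipeline"                 # version-less copy of the module pipeline
  }
}
```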

You would use the ingest pipeline REST API to get the pipelines and update them.
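As a rough illustration of that automation, here is a sketch in Python using the `requests` library against the ingest pipeline API. The endpoint, credentials, and anything outside the threatintel pipeline names are assumptions; TLS and auth details will differ per environment.

```python
"""Sketch: keep a version-less copy of the threatintel MISP ingest pipeline in sync."""
import re
import requests

ES_URL = "https://es01.example.org:9200"   # hypothetical Elasticsearch endpoint
AUTH = ("elastic", "changeme")             # hypothetical credentials
TARGET = "threatintel-misp-pipeline"       # the version-less copy referenced by Logstash

# Fetch all versioned copies of the module pipeline, e.g.
# filebeat-8.4.2-threatintel-misp-pipeline, filebeat-8.4.3-threatintel-misp-pipeline, ...
resp = requests.get(
    f"{ES_URL}/_ingest/pipeline/filebeat-*-threatintel-misp-pipeline", auth=AUTH
)
resp.raise_for_status()
pipelines = resp.json()
if not pipelines:
    raise SystemExit("no matching module pipelines found")

def version_key(name: str):
    """Sort helper: extract the Filebeat version from the pipeline name."""
    m = re.search(r"filebeat-(\d+)\.(\d+)\.(\d+)-", name)
    return tuple(int(x) for x in m.groups()) if m else (0, 0, 0)

# Pick the definition belonging to the newest Filebeat version.
latest = max(pipelines, key=version_key)

# Overwrite (or create) the version-less pipeline with that definition.
put = requests.put(f"{ES_URL}/_ingest/pipeline/{TARGET}", json=pipelines[latest], auth=AUTH)
put.raise_for_status()
print(f"{TARGET} now mirrors {latest}")
```

You could run something like this on a schedule (cron or similar) so the version-less pipeline keeps up with whatever version auto-patching installs.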
