Filebeat and Logstash without ingest pipelines

I am building an environment as follows:

server1 = logstash & filebeat (with Cisco module enabled)
server2 = kibana
server3 = elasticsearch

I want to use this to monitor Cisco ASA firewalls. The ASAs will send logs to Filebeat on UDP/10514, Filebeat will output to Logstash (localhost) on TCP/5044, and Logstash will then send to Elasticsearch.
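For reference, the Filebeat side of that setup would look roughly like this (a minimal sketch; the port and host values are taken from the description above, everything else is default):

```yaml
# modules.d/cisco.yml — enable the ASA fileset, listening for syslog over UDP
- module: cisco
  asa:
    enabled: true
    var.syslog_host: 0.0.0.0
    var.syslog_port: 10514

# filebeat.yml — forward events to the local Logstash instance
output.logstash:
  hosts: ["localhost:5044"]
```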

I have installed the Filebeat dashboards and index templates to Elasticsearch. My question is:

Will the Filebeat dashboards be functional (useful) in Elasticsearch, without loading any ingest pipelines? I understand that ingest pipelines will give additional parsing/modification capability, much like grok filtering in Logstash ... but what I'm trying to understand is whether the ingest pipelines have to be used in order for the Filebeat dashboards to work.

I understand that I could 'convert' the ingest pipelines into Logstash filtering, but this is not what I'm asking about. I want to know if the Filebeat dashboards will function without either ES ingest pipelines or equivalent Logstash filtering in place?


Short answer: no, you should not expect the dashboards provided by the cisco module to be useful if the data isn't properly processed by the ingest pipeline also provided by the same module.

So then the two follow-up questions become:

  • how to load the ingest pipelines provided by the cisco module into Elasticsearch, and
  • how to make the module's events, which reach Elasticsearch via Logstash, use those ingest pipelines?
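For the record, the usual answers to those two questions are sketched below (assuming Filebeat's built-in setup command and Logstash's stock elasticsearch output; the `server3` hostname is taken from the topology above):

```sh
# 1. Load the module's ingest pipelines into Elasticsearch directly from Filebeat
filebeat setup --pipelines --modules cisco
```

```
# 2. In the Logstash pipeline, tell the elasticsearch output which ingest
#    pipeline to use. When a module is enabled, Filebeat records the pipeline
#    name in each event's @metadata, so it can be referenced dynamically:
output {
  elasticsearch {
    hosts    => ["http://server3:9200"]
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```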

Before we start answering those questions I have another question: why do you have Logstash in the mix, particularly given that it's going to run on the same host as Filebeat? Why not instead have Filebeat directly output to Elasticsearch? If there's a good reason for having Logstash in the mix then we can answer the two questions above. But if we can take Logstash out of this equation, things will "just work" if you enable the cisco module in Filebeat and use the Elasticsearch output.
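To illustrate what "just work" means here: with the Logstash hop removed, the output section of filebeat.yml reduces to the sketch below (the `server3` hostname is an assumption from the topology above), and Filebeat attaches the correct ingest pipeline to each event on its own whenever a module is enabled:

```yaml
# filebeat.yml — ship directly to Elasticsearch; the cisco module sets the
# ingest pipeline on each event automatically, no extra config needed
output.elasticsearch:
  hosts: ["http://server3:9200"]
```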


Thanks. I have actually loaded the cisco ingest pipelines into Elasticsearch. The issue is that my ES is AWS-managed, with access controls allowing only AWS key-based access, so I don't think it's possible to authenticate directly from Filebeat. Logstash uses the 'amazon_es' output plugin, which can authenticate with AWS keys. However, the 'amazon_es' output plugin has no option for specifying an ingest pipeline.

All quite frustrating - unless I'm missing a workaround. I have the Filebeat index template, the dashboards, and the ingest pipelines loaded in ES, but no way to reach the ingest pipeline!

I've tried running the conversion tool from the pipeline JSON to .conf, but it just fails to run with JSON errors.
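For anyone else attempting this: the converter in question ships with Logstash as `bin/ingest-convert.sh` and takes file URIs for input and output; a typical invocation (paths here are hypothetical) looks like:

```sh
bin/ingest-convert.sh \
  --input  file:///tmp/cisco-asa-pipeline.json \
  --output file:///tmp/cisco-asa.conf
```

Note that the converter only understands a subset of ingest processors, so a module pipeline that uses unsupported processors can fail to convert, which may be what the errors above reflect.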

Are there any possible workarounds for this?


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.