How to configure multiple ingest pipelines on Filebeat to load data into Elastic Cloud

Hi,

I am running Filebeat to ingest logging data from Kubernetes. I have various applications, and I have set up a different ingest pipeline for each of them. How can I configure Filebeat to use multiple pipelines?

Thanks in advance.

There are a few ways you can do this, depending on what you are trying to ingest. If you can use the Filebeat modules, those pipelines are already set up for you. If you are ingesting custom logs, you can use conditionals or a variable pipeline name on the Elasticsearch output, driven by a field you set in the log input config block, as in the sketch below.
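For example, something along these lines in `filebeat.yml`. The paths, field values, and pipeline names are placeholders for your setup; the `pipeline` format string and the `pipelines` conditional list are both options of the Elasticsearch output:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/containers/app-a-*.log
    fields:
      log_type: app-a          # hypothetical per-application marker

  - type: log
    paths:
      - /var/log/containers/app-b-*.log
    fields:
      log_type: app-b

output.elasticsearch:
  hosts: ["https://my-deployment.example.cloud:9243"]   # hypothetical Elastic Cloud endpoint
  # Option 1: derive the pipeline name from the field set on each input
  pipeline: "%{[fields.log_type]}-pipeline"
  # Option 2: explicit conditionals (use instead of the line above)
  #pipelines:
  #  - pipeline: app-a-pipeline
  #    when.equals:
  #      fields.log_type: app-a
  #  - pipeline: app-b-pipeline
  #    when.equals:
  #      fields.log_type: app-b
```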

These are mainly Java apps that use different logging frameworks, so the logs are quite different and have to be parsed by different pipelines.

Here is a method / suggestion:

The advantage of doing it this way, with a top-level common pipeline that routes each event to the correct specific pipeline, is that you control the routing outside of the Beat, so if you change something you don't need to change the Beats config.

You can also build modular pipelines so they can be reused.
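Roughly, the top-level pipeline could look like the sketch below. The pipeline and label names are made up; it uses the `pipeline` processor to hand each document off to an app-specific pipeline based on a Kubernetes label:

```
PUT _ingest/pipeline/logs-router
{
  "description": "Top-level pipeline that routes each document to an app-specific pipeline",
  "processors": [
    {
      "pipeline": {
        "if": "ctx.kubernetes?.labels?.app == 'app-a'",
        "name": "app-a-pipeline"
      }
    },
    {
      "pipeline": {
        "if": "ctx.kubernetes?.labels?.app == 'app-b'",
        "name": "app-b-pipeline"
      }
    }
  ]
}
```

Filebeat then only needs `output.elasticsearch.pipeline: logs-router`, and any change to the routing or to the per-app pipelines happens entirely on the Elasticsearch side.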

You can also define a pipeline per input ... see here ... A minimal example of that option is below.
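A rough sketch, assuming the plain log input and made-up pipeline names; the `pipeline` option on the input overrides whatever is set on the output:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/containers/app-a-*.log
    pipeline: app-a-pipeline    # ingest pipeline applied only to events from this input

  - type: log
    paths:
      - /var/log/containers/app-b-*.log
    pipeline: app-b-pipeline
```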

Awesome
