[Logstash][Filebeat] multiple logstash pipeline with filebeat

Hello,

I installed Logstash, Elasticsearch and Kibana on a server.
I've got different pipelines in Logstash to parse different types of logs,
for example: "TPW-pipeline.conf", "weblogic-pipeline.conf", "batch-pipeline.conf" ...

At the beginning I used to put logs manually into an input folder. That way everything worked well: logs were parsed as I needed and sent to Elasticsearch and Kibana perfectly.

But now I need to get logs from other servers, so I installed Filebeat on another server and followed the docs to send files to Logstash. This works, but I use a new pipeline "beats-pipeline.conf", so my log events are not sent to the pipeline I need and they are not parsed.

So I would like to know if there is a way to tell Logstash to redirect the events passing through the beats-pipeline to one of the first pipelines I made (or maybe I have to configure something in Filebeat).

filebeat input:
(screenshot of the Filebeat input config)
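A Filebeat input section like this typically looks roughly as follows (the type and paths are assumptions, not the original config):

  filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/myapp/*.log    # assumed path, adjust to your logs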

filebeat output:
(screenshot of the Filebeat output config)
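The Logstash output section in filebeat.yml would look roughly like this (the hostname is an assumption; port 5044 matches what is used later in the thread):

  output.logstash:
    hosts: ["my-logstash-server:5044"]    # assumed hostname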

logstash beats-pipeline (for the moment I just use stdout, the shell, as output):
(screenshot of beats-pipeline.conf)
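A minimal sketch of what such a beats-pipeline.conf could look like, assuming a beats input on 5044 and a plain stdout output (illustrative only, not the original config):

  input {
    beats {
      port => 5044
    }
  }
  output {
    stdout { codec => rubydebug }
  }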

Hi,

I understand your issue.
In the past we also had Filebeat shipping logs directly to Logstash, but with only one pipeline.
So my construct there was the following:

Filebeat adds a field "logType" to the different logs. logType may be "httpd", "tomcat" or "app_1_error".
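In Filebeat that could be done roughly like this (the field name comes from above; the path and fields_under_root are assumptions):

  filebeat.inputs:
  - type: log
    paths:
      - /var/log/httpd/*.log     # assumed path
    fields:
      logType: "httpd"
    fields_under_root: true      # so the field is [logType], not [fields][logType]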

In Logstash I had only a single input listening on beats. In the filter section I used if conditions on the log type:

if [logType] == "httpd" {
  # ... filters for httpd logs here ...
}

etc, etc.
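Put together, the skeleton of such a single-pipeline setup would look roughly like this (the port and the Elasticsearch host are assumptions):

  input {
    beats {
      port => 5044
    }
  }
  filter {
    if [logType] == "httpd" {
      # httpd-specific filters ...
    }
    else if [logType] == "tomcat" {
      # tomcat-specific filters ...
    }
  }
  output {
    elasticsearch {
      hosts => ["localhost:9200"]   # assumed host
    }
  }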

For current Logstash versions I saw this here a few days ago, which may help:
https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html

I never used it, but I think you could mark the events with the field in Filebeat as mentioned above.
Then you create a pipeline which is listening on the beats port.
Based on the logType you can then forward each event to another pipeline.
As I said, I only noticed this feature a few days ago and haven't tried it out; a rough sketch is below.
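A sketch of what the pipelines.yml for that setup could look like (pipeline ids and paths are assumptions, following the distributor pattern from the linked docs):

  # config/pipelines.yml
  - pipeline.id: beats-pipeline
    path.config: "/etc/logstash/conf.d/beats-pipeline.conf"
  - pipeline.id: tpw-pipeline
    path.config: "/etc/logstash/conf.d/TPW-pipeline.conf"
  - pipeline.id: weblogic-pipeline
    path.config: "/etc/logstash/conf.d/weblogic-pipeline.conf"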

Our solution when we moved to multiple pipelines two years ago was to introduce Redis as a message broker.
Filebeat sets the Redis key from the logType field.

config in filebeat:

  # The name of the Redis list or channel the events are published to. The
  # default is filebeat.
  key: "%{[logType]:fallback}"

Each Logstash pipeline has its own redis input, checking for its relevant key. Each pipeline has its own filters and outputs.
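On the Logstash side, each pipeline's redis input would then look something like this (host and key are assumptions, matching the httpd example above):

  input {
    redis {
      host      => "localhost"   # assumed Redis host
      data_type => "list"
      key       => "httpd"       # the key Filebeat set from logType
    }
  }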

Hope it helps,
Andreas

Hello,

pipeline to pipeline is exactly what I needed.

I didn't use it exactly as it is explained; here is what I did:

In Filebeat I send everything to port 5044 of my Logstash server (nothing has changed compared to before).

Then I have my beats-pipeline.conf
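The screenshot of the config is not included; here is a minimal sketch of what a beats-pipeline.conf that forwards to the other pipelines could look like (the routing field and the virtual addresses are assumptions, following the pipeline-to-pipeline distributor pattern):

  input {
    beats {
      port => 5044
    }
  }
  output {
    if [logType] == "TPW" {
      pipeline { send_to => ["tpw"] }
    }
    else if [logType] == "weblogic" {
      pipeline { send_to => ["weblo"] }
    }
  }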

then my two other pipelines (I've got more than two, but this is enough to explain):

TPW-pipeline.conf:

weblo-pipeline.conf:
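Those two screenshots are not included either; both configs would follow the same shape, receiving events on a pipeline input bound to the virtual address the beats-pipeline sends to (addresses and the Elasticsearch output are assumptions):

  # TPW-pipeline.conf (weblo-pipeline.conf is analogous, with address => "weblo")
  input {
    pipeline { address => "tpw" }
  }
  filter {
    # the existing TPW parsing stays as it was ...
  }
  output {
    elasticsearch {
      hosts => ["localhost:9200"]   # assumed host
    }
  }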

(Just one thing: the multiline codec you had on your Logstash input won't work anymore; you will have to do the multiline handling in Filebeat.)
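In Filebeat the multiline handling is configured on the input itself; a minimal sketch (the pattern is an assumption, treating lines that do not start with a date as continuations of the previous event):

  filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log                   # assumed path
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'    # lines starting with a date begin a new event
    multiline.negate: true
    multiline.match: after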
