Multiple pipelines questions


#1

Hi,

Currently we are only using the main pipeline. Processing of the different types of logfiles is separated by conditionals.

I have a few questions about using multiple pipelines:

  1. Can both pipelines use the same beats input, or does each pipeline need its own?
  2. How can I decide which pipeline my log line has to use? My only idea is using different inputs, but that would need infrastructure changes like opening ports, etc.
  3. What are common use cases for multiple pipelines? The ones I have in mind:
  • pipeline 1 processes a beats input, pipeline 2 processes a scheduled jdbc input. Therefore a blocking / long-running jdbc query will not delay / block the beats pipeline.
  • If I have logs of the production and dev stages (of a monitored application) in the same Elastic Stack, then I could set pipeline prod to 6 CPUs and pipeline dev to 2 CPUs.
  • Are the above ideas correct? Any more ideas?

Thanks, Andreas


(Magnus Bäck) #2

Can both pipelines use the same beats input, or does each pipeline need its own?

They need their own inputs. The whole point of the support for multiple pipelines is that they don't share any inputs, outputs, or filters.
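
(Editor's note, a sketch for readers on Logstash 6.3 or later: pipeline-to-pipeline communication allows a "distributor" pattern, where one small pipeline owns the single beats port and routes events to downstream pipelines by conditional, so no extra ports need to be opened. The pipeline names and the `[log_type]` field below are illustrative assumptions, not from this thread.)

```
# distributor.conf -- owns the one beats port and only routes
input {
  beats { port => 5044 }
}
output {
  if [log_type] == "apache" {
    pipeline { send_to => ["apache_logs"] }
  } else {
    pipeline { send_to => ["other_logs"] }
  }
}

# apache.conf (a downstream pipeline) receives via a virtual address
# input {
#   pipeline { address => "apache_logs" }
# }
```

Each downstream pipeline then keeps its own filters and outputs, which preserves the isolation described above.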

pipeline 1 processes a beats input, pipeline 2 processes a scheduled jdbc input. Therefore a blocking / long-running jdbc query will not delay / block the beats pipeline.

A long-running JDBC query won't block the beats input since they run in separate threads anyway, but they could compete for the same filters and outputs, so that could be a reason to split into multiple pipelines.
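
(Editor's note: a minimal pipelines.yml sketch of that split; the ids and paths are placeholders, not from this thread.)

```yaml
# pipelines.yml -- each pipeline gets its own config, queue, and workers,
# so a slow jdbc query only occupies the jdbc pipeline's workers
- pipeline.id: beats
  path.config: "/etc/logstash/conf.d/beats.conf"
- pipeline.id: jdbc
  path.config: "/etc/logstash/conf.d/jdbc.conf"
```
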

If I have logs of the production and dev stages (of a monitored application) in the same Elastic Stack, then I could set pipeline prod to 6 CPUs and pipeline dev to 2 CPUs.

How would you limit the CPU usage per pipeline?


#3

I thought about something like this:

- pipeline.id: pipeline_dev
  path.config: "/etc/path/to/config/dev"
  pipeline.workers: 2
- pipeline.id: pipeline_prod
  path.config: "/etc/path/to/config/prod"
  pipeline.workers: 6

#4

I don't know if I'm getting this correctly.
What do you mean by "don't share any outputs or filters"?
Is the same output/filter identified by its id?

I mean, can I push events from two pipelines to the same ES? Of course each pipeline config has its own output configuration (but both configs are identical).

When I have two pipelines which each have a copy of the same configuration, and I've given a custom id to my filter plugins (e.g. http_grok), will the filter plugins of both pipelines be independent? Hope it is clear what I am asking for :wink:


(Magnus Bäck) #5

I thought about something like this:

Oh, okay. Yeah, that makes sense.


(Magnus Bäck) #6

I mean, can I push events from two pipelines to the same ES?

Of course you can.

When I have two pipelines which each have a copy of the same configuration, and I've given a custom id to my filter plugins (e.g. http_grok), will the filter plugins of both pipelines be independent?

Yes.
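
(Editor's note: plugin ids only need to be unique within a single pipeline; each pipeline instantiates its own copy of every plugin. A sketch, with assumed file names, of the same id appearing in both configs:)

```
# Both /etc/path/to/config/dev/http.conf and
# /etc/path/to/config/prod/http.conf can contain this block;
# each pipeline runs its own independent instance of the filter.
filter {
  grok {
    id => "http_grok"
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
```

The id is mainly useful for telling instances apart in logs and in the monitoring API, which reports plugin stats per pipeline.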


#7

Great, many thanks. I think I now understand the concept behind it.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.