Multiple http filters at the same time

Hi Team,

Does Logstash support multiple http filters at the same time? Please confirm.

Hi,

Can you be more specific about your configuration, please?

If you're asking whether you can use the http filter several times in the same Logstash conf, the answer is yes.
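For example, a minimal sketch (the URLs and target fields below are made up) with two http filters chained in the same filter block:

  filter {
    # first enrichment call (hypothetical URL)
    http {
      url => "https://example.com/api/lookup1"
      verb => "GET"
      target_body => "lookup1_response"
    }
    # second enrichment call, applied to the same event afterwards
    http {
      url => "https://example.com/api/lookup2"
      verb => "GET"
      target_body => "lookup2_response"
    }
  }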

Hi,

Not in the same conf file, but in different conf files. Example: 3 conf files are written, all having an http filter, and running at the same time. Will all three of them work at the same time if triggered?

If you have 1 pipeline, then Logstash will merge the conf files following their alphabetical / numerical order.

B.conf :

I am the first conf file

A.conf :

I am the second conf file

Logstash will read:

I am the second conf file
I am the first conf file

I'm not sure what you mean by "same time", because Logstash processes the merged configuration like a script, line by line, and thus cannot use several conf files at the same time.

If you want to order them you can easily do this by using numbers like 01-first.conf, 02-second.conf, etc.

All three conf files have http_poller in the input and are scheduled using a cron job. All three are scheduled at the same time.

Hello @akhilsharma.in

@grumo35 is right: the configuration files are merged into one.

  • If the data coming from the http_poller must go through only one HTTP Filter, then you can use 3 pipelines via the multiple pipelines feature (each pipeline will use one conf file); see the pipelines.yml sketch after this list.

  • If instead the data coming from the http_poller must go through all 3 HTTP Filters, unfortunately it is not possible to run them in parallel.
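For the first option, a rough pipelines.yml sketch (the pipeline ids and paths are hypothetical):

  # pipelines.yml - one pipeline per conf file
  - pipeline.id: api_a
    path.config: "/etc/logstash/conf.d/a.conf"
  - pipeline.id: api_b
    path.config: "/etc/logstash/conf.d/b.conf"
  - pipeline.id: api_c
    path.config: "/etc/logstash/conf.d/c.conf"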

Note that in the second case, by default, pipeline.workers matches the number of cores on the host.
This means N events can be processed in parallel (where N is pipeline.workers) by each HTTP Filter, but each individual event will go through the 3 HTTP Filters sequentially.

Also keep in mind that the biggest cost of the HTTP Filters will be the RTT (Round Trip Time), plus the serving time of the HTTP server.

If you really want the 3 HTTP Filters to run at the same time, it would be possible to create a pipeline-to-pipeline setup (a sketch follows the list below) which makes use of:

  • "Input pipeline"
    • Takes the input data from an input plugin
    • Adds a unique id to each event (e.g. using the fingerprint filter)
    • Dispatches the event to 3 different pipelines using the pipeline to pipeline feature
  • "HTTP Filter pipelines" (x 3)
    • Each one contains one of the HTTP Filters
    • Sends the result to an output pipeline
  • "Output pipeline"
    • Uses the aggregate filter to re-group the results by a unique key
    • Sends the data out to the destination
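A very rough sketch of that layout, with hypothetical pipeline names, URLs and field names (the aggregate code in particular would need to be adapted, and the aggregate filter requires its pipeline to run with pipeline.workers: 1):

  # pipelines.yml would declare 5 pipelines: fanout, http_a, http_b, http_c, collect
  # (collect with "pipeline.workers: 1" because of the aggregate filter)

  # fanout.conf - the "Input pipeline"
  input {
    http_poller {
      urls => { source => "https://example.com/api/source" }   # hypothetical
      schedule => { cron => "* * * * * UTC" }
      codec => "json"
    }
  }
  filter {
    # unique id per event, used later as the aggregation key
    fingerprint {
      method => "UUID"
      target => "request_id"
    }
  }
  output {
    # each downstream pipeline receives a full copy of the event
    pipeline { send_to => ["http_a", "http_b", "http_c"] }
  }

  # http_a.conf - one of the three "HTTP Filter pipelines" (http_b / http_c look the same)
  input  { pipeline { address => "http_a" } }
  filter {
    http {
      url => "https://example.com/api/enrich_a"                 # hypothetical
      target_body => "enrich_a"
    }
  }
  output { pipeline { send_to => ["collect"] } }

  # collect.conf - the "Output pipeline": re-group the three partial results by request_id
  input  { pipeline { address => "collect" } }
  filter {
    aggregate {
      task_id => "%{request_id}"
      code => "
        map['enrich_a'] ||= event.get('enrich_a')
        map['enrich_b'] ||= event.get('enrich_b')
        map['enrich_c'] ||= event.get('enrich_c')
        event.cancel()
      "
      push_map_as_event_on_timeout => true
      timeout => 30
      timeout_task_id_field => "request_id"
    }
  }
  output {
    elasticsearch { hosts => ["http://localhost:9200"] }        # or whatever the destination is
  }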

Thanks for all the details!

All of the 3 conf files are calling REST APIs in the same way:

The input block generates an authentication token using http_poller.
The filter block uses that token to run the http filter plugin.
The output block sends the data to the SIEM.
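For reference, each conf file looks roughly like this (the URLs, field names and the TCP output are placeholders; this also assumes the auth response carries a "token" field and that the http filter version in use supports sprintf references in headers):

  input {
    # poll the auth endpoint on a schedule; the JSON response (with the token) becomes the event
    http_poller {
      urls => {
        auth => {
          method => "post"
          url => "https://example.com/api/auth"                 # placeholder
          body => '{"user":"...","password":"..."}'
          headers => { "Content-Type" => "application/json" }
        }
      }
      schedule => { cron => "0 * * * * UTC" }
      codec => "json"
    }
  }
  filter {
    # call the data API with the token taken from the poller response
    http {
      url => "https://example.com/api/data"                     # placeholder
      verb => "GET"
      headers => { "Authorization" => "Bearer %{token}" }       # assumes a "token" field on the event
      target_body => "api_data"
    }
  }
  output {
    # ship to the SIEM, e.g. JSON lines over TCP (placeholder destination)
    tcp {
      host => "siem.example.com"
      port => 5514
      codec => "json_lines"
    }
  }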

Let's say we need 3 http_pollers, each unique, and they will send 3 tokens to 3 different filters. Do we then need to create as many pipelines: 3 for input, 3 for filters and 1 for output?

Please assist.

It depends on whether the 3 HTTP filters are totally independent or not.
If they are independent, you can create 3 distinct pipelines using the multiple pipelines feature mentioned above, each one with 1 input, 1 filter and 1 output.

3 http_pollers are independent = 3 individual pipelines sending data.
3 http filters are independent = 3 individual pipelines accepting data from the input.
The output can be independent or not; we can use tags to differentiate output streams. Is that correct?
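For example, with a shared output, tags set on each input could route the events (a sketch with made-up tag names and destinations):

  # in each input, e.g.:
  input {
    http_poller {
      urls => { auth => "https://example.com/api/auth_a" }      # made up
      schedule => { cron => "0 * * * * UTC" }
      tags => ["api_a"]
    }
  }

  # in a shared output block:
  output {
    if "api_a" in [tags] {
      tcp { host => "siem.example.com" port => 5514 codec => "json_lines" }
    } else if "api_b" in [tags] {
      tcp { host => "siem.example.com" port => 5515 codec => "json_lines" }
    }
  }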
