Logstash Filter

Can we apply 2 filters in 1 Logstash conf file? I want to parse 1 log twice and send the messages to multiple outputs, e.g.:
1st filter removes some fields and sends the message to an output.
2nd filter does another check, keeps the fields removed by filter 1, and sends to multiple outputs.

Also, can we use a lookup table/file in Logstash?

If you want to process an event in two different ways you can use pipeline-to-pipeline communication with a forked-path pattern.

You can do lookups using the translate, jdbc_streaming, jdbc_static, or memcached filters, and possibly others.
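For example, a file-based lookup with the translate filter might look like the sketch below (the field names and dictionary path are made up for illustration, and option names vary by plugin version; older versions of the filter use field/destination instead of source/target):

filter {
  translate {
    source => "[status_code]"
    target => "[status_description]"
    dictionary_path => "/etc/logstash/status_lookup.yml"
    fallback => "unknown"
  }
}

where status_lookup.yml is a simple key/value file, e.g. "200": "OK".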

Thanks for the response.
Where do I save the pipeline.yml file, and how will it get picked up?

- pipeline.id: main-intake
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/intake.conf"
  config.string: |
    output { pipeline { send_to => ["abc, "def"] } }
- pipeline.id: abc
  queue.type: persisted
  config.string: |
    input { pipeline { address => "abc" } }
    filter {...}
    output {...}
- pipeline.id: def
  queue.type: persisted
  config.string: |
    input { pipeline { address => "def" } }
    filter {...}
    output {...}

Is there anything else I need to do?

pipelines.yml (plural) should be in the directory pointed to by path.settings. On UNIX that would typically be /etc/logstash.
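For reference, package installs default path.settings to /etc/logstash; if you run Logstash from a downloaded archive instead, you have to point it there yourself with the standard --path.settings option, e.g.:

bin/logstash --path.settings /etc/logstash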

Got it, thanks.
I can see the pipelines running now, but they are not sending messages to the outputs.

Is this the default behavior? The main pipeline is not running.

[2021-11-16T01:57:34,729][INFO ][logstash.agent           ]
Pipelines running {:count=>2, :running_pipelines=>[:abc, :def],
:non_running_pipelines=>[:"main-intake"]}

I believe this is either/or, and if both are supplied then one will be ignored. I suspect it is using config.string. That has no input, so even if it is started the pipeline will immediately shut down, leaving it in the non-running bucket.

I think that you can provide both path.config and config.string and it will concatenate the configs; at least you could do that in older versions. I do not know about the newer ones because I do not use config.string.

But that config.string is missing a double quote.

Should be: output { pipeline { send_to => ["abc", "def"] } }

I would suggest that you move the config out of pipelines.yml into files: create a config file for the abc and def pipelines and use path.config pointing to them, and also add the output of the main-intake pipeline to the intake.conf file.
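To illustrate, a sketch of that split-out layout (the file names here are just examples):

- pipeline.id: main-intake
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/intake.conf"
- pipeline.id: abc
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/abc.conf"
- pipeline.id: def
  queue.type: persisted
  path.config: "/etc/logstash/conf.d/def.conf"

intake.conf would then end with output { pipeline { send_to => ["abc", "def"] } }, and abc.conf would start with input { pipeline { address => "abc" } } (likewise for def.conf).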


I knew I should have tested before guessing :smiley:

If config.string comes after path.config for the same pipeline then path.config is ignored, and logstash logs where it got the configuration from:

[INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", ... "pipeline.sources"=>["config string"]

If I run

- pipeline.id: main
  path.config: "/home/user/test.conf"

- pipeline.id: route
  config.string: "output { stdout {} }"

then I was surprised to find that logstash creates a stdin {} input for the second pipeline, so that it keeps running.


Thanks @Badger , @leandrojmp
That was very helpful.
