Duplicate rows/events when using multiple config files

Hi,

I know this has been discussed more than once, but none of the solutions seems to apply to my problem.

I have a very basic installation with Logstash 5.4 and Redis 3.2.3 on a test VM (CentOS 7). It uses the jdbc input plugin to query all schemata from the installed MariaDB and the syslog input plugin to gather syslog information, and sends everything directly, without any filter, to Redis on the same server.

For this purpose I currently have two config files in the conf.d folder, one for the jdbc input and one for the syslog input.
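For illustration, the two files look roughly like this (connection details, ports and the Redis key are placeholders, not my real values):

# /etc/logstash/conf.d/10-jdbc.conf (placeholder values)
input {
  jdbc {
    jdbc_connection_string => "jdbc:mariadb://localhost:3306/information_schema"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    jdbc_driver_library => "/opt/mariadb-java-client.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    statement => "SELECT schema_name FROM information_schema.schemata"
  }
}
output {
  redis { host => "127.0.0.1" data_type => "list" key => "logstash" }
}

# /etc/logstash/conf.d/20-syslog.conf (placeholder values)
input {
  syslog { port => 5514 }
}
output {
  redis { host => "127.0.0.1" data_type => "list" key => "logstash" }
}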

Now what I see is that all events have duplicates in Redis. For example, the complete set of rows from the jdbc input (normally 4) is duplicated, so there are 8 for one run. It seems that Logstash concatenates all files in conf.d into one big config, so every output receives the events from every input.

When I remove one of the config files and restart Logstash, the duplicates are gone.

But how can I avoid this behaviour? I want to have a config file for each plugin for a better overview, but I don't want every output to process the events from every config file. That makes no sense to me.

Any ideas how to avoid this?

Cheers
Thorsten

Found it on my own.

When using multiple config files, each with a dedicated input and output in the same file, you really need to set a type on the input and also check for that type in the output of each file:

input {
  your_plugin {
    type => "your_input_type"
  }
}

output {
  if [type] == "your_input_type" {
    .......
  }
}

Otherwise, it seems that when no type conditional is specified on an output, the data of all inputs is written to that output.
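Applied to my two files, the fix looks roughly like this (same placeholder values as in the sketch above; only the type and the conditional matter):

# /etc/logstash/conf.d/10-jdbc.conf
input {
  jdbc {
    # ... jdbc connection options as in the sketch above ...
    type => "jdbc"
  }
}
output {
  if [type] == "jdbc" {
    redis { host => "127.0.0.1" data_type => "list" key => "logstash" }
  }
}

# /etc/logstash/conf.d/20-syslog.conf
input {
  syslog {
    port => 5514
    type => "syslog"
  }
}
output {
  if [type] == "syslog" {
    redis { host => "127.0.0.1" data_type => "list" key => "logstash" }
  }
}

With this in place, each output only handles the events of its own input and the duplicates are gone.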

Cheers
Thorsten
