Possible to run modules and pipelines on one server?

(Jon Dison) #1

I've seen several mentions of "...when 6.0 comes out you can run multiple pipelines", but can we use modules and pipelines together?

I successfully enabled the netflow module, but now when Logstash starts up I see this message printed and everything in conf.d seems to be ignored:
[logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified

So is it possible to use the netflow module and also set up a pipeline for syslog data? If so, can you provide details on exactly how that should be set up? I've seen several mentions that it should work, but no one has a link showing how.
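For context, a module is typically enabled either with `--modules netflow` on the command line or in logstash.yml, and either way Logstash treats it as its own standalone pipeline, which is exactly what triggers the warning above (this sketch assumes standard 6.x module settings; the port value is illustrative):

```yaml
# logstash.yml — enabling a module here (or passing `--modules netflow`
# on the command line) makes Logstash run it as a standalone pipeline,
# which causes the "Ignoring the 'pipelines.yml' file" warning.
modules:
  - name: netflow
    var.input.udp.port: 2055
```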


(Diogo Assumpcao) #2

Looks like this is not possible, but I couldn't find a definitive answer in the docs.

(Mark Walkom) #3

You should definitely be able to do this. If you're having issues similar to the original poster's, providing the logs would be helpful :slight_smile:

(Diogo Assumpcao) #4

Hi @warkolm, thanks for getting on this!

And here's my logstash.yml:

config.debug: true
log.level: debug
modules:
  - name: netflow
    var.elasticsearch.hosts: "localhost:9200"
    var.input.udp.port: 2055
    var.kibana.host: ""
path.data: /data/logstash
path.logs: /data/log/logstash

my pipelines.yml:

- pipeline.id: logs_and_traps
  path.config: "/etc/logstash/logs_and_traps/pipeline.config"

And my logs with debug and config debug are on this gist

(Diogo Assumpcao) #5

I've updated to 6.1.1 and still can't get the two working together. Using the logstash.yml above I can get the module working, but pipelines.yml continues to be ignored with this message:

[2017-12-27T19:05:27,339][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
[2017-12-27T19:05:27,353][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified

Any ideas?

(Diogo Assumpcao) #6

I think this comment settles the issue:

guyboertje commented 13 days ago
I can confirm that the modules feature in Logstash is a standalone pipeline. Using -e, -f or --modules will result in pipelines.yml being ignored.
The reason for this is that modules are a "getting started quickly" feature, so that new Elastic Stack users can quickly build up an appreciation of the Stack without having to build Kibana objects and learn the LS config language. To do this we did not want a "toy" implementation that looks nice but has no real value.

The netflow module code is open source, which means that you can find and extract the LS config after you have run --setup to get the ES and Kibana artefacts installed. The config is an ERB template, so there are sections between <%= and %> that are overwritten with real values - replace these and you will have a netflow config.
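To illustrate the substitution the quoted comment describes: the module's pipeline config ships as an ERB template, and the <%= ... %> sections are filled in with the module's var.* settings. A minimal Ruby sketch of that mechanism (the one-line template fragment and the `port` variable are made up for illustration; the real netflow template is much larger):

```ruby
require "erb"

# Hypothetical fragment of a module config template; the real netflow
# template is larger, but the substitution mechanism is the same.
template = "input { udp { port => <%= port %> } }"

port = 2055  # stands in for the module's var.input.udp.port setting
rendered = ERB.new(template).result(binding)

puts rendered  # => input { udp { port => 2055 } }
```

Replacing every <%= ... %> section this way yields a plain Logstash config file that can be referenced from pipelines.yml like any other pipeline.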

(Michael) #7

So what I'm hearing is: modules are quick and easy, and plugins/pipelines are slightly more involved. I cannot use a module for one process (Netflow) and build a pipeline for another (like JDBC); I would need to use a pipeline for Netflow as well, so that everything is handled the same way. I'm a bit surprised, but is that correct?

BTW - the Netflow dashboards that come with the module are nice, and I was worried that using the open-source mapping would cause a problem and break them. For me, the JDBC connection is more important, so I'll just go with the latter option.
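For what it's worth, once the netflow config has been extracted to a plain pipeline file, the all-pipelines setup described here might look something like this in pipelines.yml (the paths and pipeline IDs are illustrative):

```yaml
# pipelines.yml — every workload as an ordinary pipeline, no modules enabled
- pipeline.id: netflow
  path.config: "/etc/logstash/netflow/extracted_netflow.conf"
- pipeline.id: jdbc
  path.config: "/etc/logstash/jdbc/pipeline.conf"
```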

(Jordan Sissel) #8

It's currently a by-design limitation that modules and pipelines cannot be mixed. However, we will improve this in the future to allow you to mix modules and custom pipelines.

(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.