Logstash pipelines work separately, but not together

Hi everyone, and sorry to bother you, but I can't find a proper solution to this.

I'm trying to set up a pipelines.yml file to ingest data with Logstash through three pipelines.
I'm running Logstash on a single-node ELK stack with 16 GB RAM, 6 cores, and a 100 GB drive.
In my pipelines.yml file, I specified the path to a configuration file for each pipeline, like this:

    - pipeline.id: main
      path.config: "/etc/logstash/conf.d/*.conf"
    - pipeline.id: mypipeline1
      path.config: "/home/myuser/configs_logstash/logstash_pipeline1.conf"
    - pipeline.id: mypipeline2
      path.config: "/home/myuser/configs_logstash/logstash_pipeline2.conf"
    - pipeline.id: mypipeline3
      path.config: "/home/myuser/configs_logstash/logstash_pipeline3.conf"

Every pipeline works perfectly when run alone, but together the third one isn't executed.
I also noticed that even when I comment out, for example, mypipeline2 and run Logstash with this YAML file, Logstash still runs mypipeline2, but still not mypipeline3.
I also tried setting one worker per pipeline, with no results.
Every pipeline has a stdout output for debugging purposes, together with an Elasticsearch output.
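
Each pipeline's conf looks roughly like the sketch below (the file path and index name are illustrative, not my exact config):

    input {
      file {
        path => "/path/to/source1.json"   # illustrative source path
        codec => "json"
      }
    }
    output {
      stdout { codec => rubydebug }       # debugging output
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "index1"                 # each pipeline writes to its own index
      }
    }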

Any ideas about what I'm missing or forgetting to set?

Thank you for any help you can give me. I'm glad to use this wonderful stack and to be part of this community!

I think you may need a pipeline to route the data stream properly to the other pipelines you have defined. Have a look at https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html -- there are different patterns for pipeline-to-pipeline communication. I have successfully tried the distributor pattern.
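
For reference, the basic plumbing in pipelines.yml looks something like this (the pipeline ids here are placeholders): the distributor pipeline's output sends events to a named virtual address, and each downstream pipeline's input reads from its own address:

    - pipeline.id: distributor
      config.string: |
        input { beats { port => 5044 } }
        output { pipeline { send_to => ["downstream-a"] } }
    - pipeline.id: downstream-a
      config.string: |
        input { pipeline { address => "downstream-a" } }
        output { stdout {} }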

Thank you for your answer, gsk!
The thing is, the three pipelines are independent of each other: they ingest data from three different JSON sources and go into different indices in Elasticsearch. I mean, they are not in a "waterfall" flow, just three flows I want to run with the same Logstash instance.
Now that you've linked that, I'll look into using this pipeline-to-pipeline communication to isolate each pipeline.

P.S. Any idea why the pipelines are still executed by Logstash even when I comment them out?

I don't know exactly what's happening there, since I can't see your configuration or logs. But your case seems similar to mine: I have two Filebeats sending data to a Logstash instance, and I send each stream to its own index in the cluster; you have three JSON sources. What I did was configure a pipeline whose input listens for the incoming data and whose output routes events to the proper pipeline according to the value of a field I added in each Filebeat to differentiate the streams -- you could do the same by adding a field to your JSON sources to differentiate your streams. This is described in the distributor pattern. After routing your streams to the proper pipeline, you can delete the added field if you don't want it saved in your index.
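
As a rough sketch (the field name, port, addresses, and index are placeholders, not my exact config), the routing output in the distributor pipeline and the cleanup in one downstream pipeline could look like this:

    # distributor pipeline: route by the value of the added field
    input { beats { port => 5044 } }
    output {
      if [stream] == "source1" {
        pipeline { send_to => ["pipeline1"] }
      } else if [stream] == "source2" {
        pipeline { send_to => ["pipeline2"] }
      } else {
        pipeline { send_to => ["pipeline3"] }
      }
    }

    # downstream pipeline (e.g. pipeline1): drop the routing field, then index
    input { pipeline { address => "pipeline1" } }
    filter {
      mutate { remove_field => ["stream"] }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "index1"
      }
    }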

Now it's clearer, and it sounds like a good solution.
Thank you for your help, gsk! I had missed this pipeline-to-pipeline distributor :slight_smile:

I'm glad it worked out.