When multiple pipelines are configured for Logstash running on Docker, data is not transferred between them

When multiple pipelines are configured for Logstash running on Docker with a custom image, the pipelines start successfully, but no data is transferred between them.

Execution Steps:
    1. Add the parameters below to pipelines.yml (a hedged variant with explicit file-input options is sketched after the steps):
    - pipeline.id: in-take
      config.string: |
       input { file { path => "/var/log/sample.log" } }
       output { pipeline { send_to => ["file"] } }
    - pipeline.id: out-take
      config.string: |
       input { pipeline { address => "file" } }
       output { stdout {} }
    2. Create a "Dockerfile" inside a directory "dockerLogstash"
    # vim Dockerfile
    FROM docker.elastic.co/logstash/logstash:6.4.1
    RUN rm -rf /usr/share/logstash/config/pipelines.yml
    RUN rm -rf /usr/share/logstash/config/logstash.yml
    RUN rm -rf /usr/share/logstash/pipeline/logstash.conf
    ADD config/pipelines.yml /usr/share/logstash/config/
    ADD config/logstash.yml /usr/share/logstash/config/
    ADD sample.log /var/log/
    USER root
    3. Place the attached configuration files (logstash.yml, pipelines.yml) under the directory created in step 2, in the config/ subdirectory referenced by the ADD instructions above, with sample.log next to the Dockerfile (a hypothetical minimal logstash.yml is sketched after the steps)
    4. Build the custom Logstash Docker image
    # docker build -t <image-name> .   
    5. Check that the image built in step 4 is listed
    # docker images
    6. Run the custom Logstash Docker image (a run-and-verify sketch follows the steps)
    # docker run <image-id>
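
For reference, a hedged variant of the pipelines.yml from step 1: by default the file input tails /var/log/sample.log from its end, so events only flow for lines appended after Logstash starts. The start_position and sincedb_path options below are assumptions added for testing, not part of the original configuration.

    - pipeline.id: in-take
      config.string: |
       input {
         file {
           path => "/var/log/sample.log"
           start_position => "beginning"   # assumption: read existing lines, not only new ones
           sincedb_path => "/dev/null"     # assumption: do not persist the read position between runs
         }
       }
       output { pipeline { send_to => ["file"] } }
    - pipeline.id: out-take
      config.string: |
       input { pipeline { address => "file" } }
       output { stdout { codec => rubydebug } }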
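
The attached logstash.yml is not reproduced here; purely as an illustration, a minimal settings file for this multi-pipeline test might look like the hypothetical sketch below. The key point is that it must not set path.config (and Logstash must not be started with -f or -e), otherwise pipelines.yml is ignored.

    # Hypothetical minimal logstash.yml for this test -- NOT the attached file.
    http.host: "0.0.0.0"
    # path.config is deliberately left unset so that pipelines.yml is honoured.
    xpack.monitoring.enabled: false   # assumption: no monitoring cluster available in this test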
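
Purely as an illustration, steps 4-6 could be run with a named image and container so that the data flow can be verified; my-logstash and logstash-test are placeholder names, not from the original report. Since the file input tails /var/log/sample.log from its end by default, append a fresh line (e.g. via docker exec) and then watch the container log for the stdout output of the out-take pipeline.

    # docker build -t my-logstash .
    # docker run -d --name logstash-test my-logstash
    # docker exec logstash-test sh -c 'echo "hello pipelines" >> /var/log/sample.log'
    # docker logs -f logstash-test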
