Problem ingesting Filebeat data with multiple pipelines

Hello

I have successfully configured multiple pipelines, and each pipeline normally loads its data.
My setup consists of three JDBC inputs and one Filebeat input.
This morning I ran into a problem loading data from Filebeat into Logstash.
Filebeat picked up the files from its directory and transferred them to Logstash, but for a reason I do not understand, Logstash did not ingest the data.
I have these questions:

  1. Is it possible for one pipeline to impact the data load of another while it is running?
  2. How can I be sure that Logstash will process all of the connector inputs?

Thank you for your suggestions

What does your configuration look like?

Hello Christian

Sorry for the late response; I was away these past few days.

My pipelines.yml file is as follows:

 - pipeline.id: pipeline_1
   path.config: "/etc/logstash/conf.d/pipeline_1.conf"

 - pipeline.id: pipeline_2
   path.config: "/etc/logstash/conf.d/pipeline_2.conf"

 - pipeline.id: pipeline_3
   path.config: "/etc/logstash/conf.d/pipeline_3.conf"

I am sharing pipeline_1, which is the one with the problem.

input {

  beats {
    port => 5044
    type => "logs"    # tag beats events so the output conditional can route them
  }

  jdbc {
    jdbc_driver_library => "/data/driverSQL/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://16.10.25.36:236;instanceName=jhhdhjjdj;databasename=database"
    jdbc_user => "User_here"
    jdbc_password => "Password_here"
    # every first Sunday of the month, at 18:00
    schedule => "0 18 * * sun#1"
    statement => "SELECT * FROM [database].[dbo].[table] WHERE date > :sql_last_value"
    use_column_value => true
    tracking_column => "date"
    tracking_column_type => "timestamp"
    clean_run => false
    last_run_metadata_path => "/etc/logstash/.logstash_jdbc_last_run"
    type => "database"
  }

}

filter {

}

output {

  if [type] == "logs" {

    elasticsearch {
      hosts => ["localhost:9200"]
      user => "user_login"
      password => "password"
      index => "indexfile"
      document_type => "My_document_type"
      document_id => "%{[fields_id]}"
    }

  } else {

    elasticsearch {
      hosts => ["localhost:9200"]
      user => "user_login"
      password => "password"
      index => "indexdatabase"
      document_id => "%{[fields_id]}"
    }

  }

}

This pipeline includes both the beats input and one jdbc input. I have the impression that this is the source of my problem.

I plan to separate these two inputs into different pipelines, because when I comment out the jdbc input, the beats input works fine.
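For example, the beats-only pipeline could keep the same output and simply drop the jdbc input. A sketch of what I have in mind, where the file name pipeline_beats.conf is just an example:

# /etc/logstash/conf.d/pipeline_beats.conf
input {
  beats {
    port => 5044
    type => "logs"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "user_login"
    password => "password"
    index => "indexfile"
    document_type => "My_document_type"
    document_id => "%{[fields_id]}"
  }
}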

I have a question about the pipelines.yml file.

Is there a problem with using only the main pipeline to start all the connectors and the beats input?

 - pipeline.id: main
   path.config: "/etc/logstash/conf.d/*.conf"

If you are only going to use a main pipeline, there is no need to use pipelines.yml as this is the default behaviour. I would however recommend splitting it up into 4 pipelines as they seem to have different combinations of inputs and outputs.
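For example, pipelines.yml could look something like this (the pipeline ids and file names below are only placeholders, assuming the beats input moves into its own config file):

 - pipeline.id: beats
   path.config: "/etc/logstash/conf.d/pipeline_beats.conf"

 - pipeline.id: jdbc_1
   path.config: "/etc/logstash/conf.d/pipeline_jdbc_1.conf"

 - pipeline.id: jdbc_2
   path.config: "/etc/logstash/conf.d/pipeline_jdbc_2.conf"

 - pipeline.id: jdbc_3
   path.config: "/etc/logstash/conf.d/pipeline_jdbc_3.conf"

Once they are split, you can also check that each pipeline is receiving events through the monitoring API (GET localhost:9600/_node/stats/pipelines), which reports event counts per pipeline.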

Ok, thank you.

I will come back to you to confirm that everything is OK.

Hello Christian

Everything works properly with the 4 inputs split into separate pipelines.

Thank you
