How to run multiple instances on one machine using the jdbc input plugin

Hi all,

I have six configuration files. All of them use the jdbc input plugin to read data from MySQL every minute, with Elasticsearch as the output. Each instance writes to a different index, so I have six config files and six indexes.

When I run the six instances one by one, they all work fine. But if I put the six config files into a directory and start Logstash manually with the `-f` parameter pointing to that directory, the documents are not ingested into the correct indexes; they all get mixed up.

I'm using CentOS 7.2 with Elastic Stack 5.5.1, and I installed Logstash and Elasticsearch by simply unpacking the tar files.

Is there any solution to this problem? I really need to monitor changes in these six MySQL tables simultaneously.

Here are two of my config files:

1:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://*****"
    jdbc_user => "*****"
    jdbc_password => "*****"

    jdbc_driver_library => "/opt/elasticstack/logstash-5.5.1/lib/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"

    statement_filepath => "/home/elsearch/data_migration_v0.2/sql/thread_mysql2es.sql"

    schedule => "* * * * *"
    last_run_metadata_path => "/home/elsearch/.logstash_jdbc_last_run_thread"

    type => "thread"
  }
}

output {
  elasticsearch {
    hosts => [ "https://****:9200" ]
    index => "thread"
    document_id => "%{id}"

    user => "logstash_lsj"
    password => "logstash_lsj123"
    ssl => true
    cacert => "/opt/elasticstack/logstash-5.5.1/ca.pem"
  }
}

2:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://*****"
    jdbc_user => "*****"
    jdbc_password => "*****"

    jdbc_driver_library => "/opt/elasticstack/logstash-5.5.1/lib/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"

    statement_filepath => "/home/elsearch/data_migration_v0.2/sql/topic_mysql2es.sql"

    schedule => "* * * * *"
    last_run_metadata_path => "/home/elsearch/.logstash_jdbc_last_run_topic"

    type => "topic"
  }
}

output {
  elasticsearch {
    hosts => [ "https://****:9200" ]
    index => "topic"
    document_id => "%{id}"

    user => "logstash_lsj"
    password => "logstash_lsj123"
    ssl => true
    cacert => "/opt/elasticstack/logstash-5.5.1/ca.pem"
  }
}

Thanks so much!

Until Logstash 6.0 is officially released, you will have issues trying to load all of those pipelines in a single Logstash instance. The reason is that all configuration files in that directory are merged into a single pipeline, so every event from every jdbc input flows through every output, which is why your documents end up in the wrong indexes. Technically, a directory passed with `-f` will still be merged after 6.0 as well, but the new pipelines.yml file will let you define individual pipelines.
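The classic workaround is to use the `type` field you are already setting on each input and wrap the outputs in conditionals, so the merged pipeline still routes events to the right index. A minimal sketch combining your two configs (the elided settings are placeholders for the connection, driver, schedule, and auth options you already have):

input {
  jdbc {
    # ... thread jdbc settings from config 1 ...
    type => "thread"
  }
  jdbc {
    # ... topic jdbc settings from config 2 ...
    type => "topic"
  }
}

output {
  if [type] == "thread" {
    elasticsearch {
      index => "thread"
      document_id => "%{id}"
      # ... hosts/ssl/auth settings ...
    }
  } else if [type] == "topic" {
    elasticsearch {
      index => "topic"
      document_id => "%{id}"
      # ... hosts/ssl/auth settings ...
    }
  }
}

With conditionals in place, running `-f` against the whole directory (or a single merged file) keeps each table's rows going to its own index.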

That blog post I just linked explains the problem, the ways users have worked around it in the past, and the new way coming in Logstash 6.0.
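As a sketch of what the 6.0 approach will look like (the pipeline IDs and paths here are illustrative, not your actual files), pipelines.yml lets each config file run as its own isolated pipeline:

# config/pipelines.yml (Logstash 6.0+)
- pipeline.id: thread
  path.config: "/path/to/thread.conf"
- pipeline.id: topic
  path.config: "/path/to/topic.conf"

Each pipeline gets its own inputs, filters, and outputs, so no conditionals are needed.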


Thank you so much, the blog helped a lot. I will try out the "tag" approach and the multiple-instances approach. I will also play with the 6.0 beta on my own laptop.
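One note on the multiple-instances route: each Logstash process needs its own data directory, or the second instance will refuse to start because the default path.data is already locked. A sketch of how the instances might be launched (the paths are illustrative):

bin/logstash -f thread.conf --path.data /tmp/ls-thread &
bin/logstash -f topic.conf --path.data /tmp/ls-topic &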

Looking forward to the general release of the 6.0 version. :grinning:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.