Multiple Logstash instances for ecommerce

Hi,
I want to implement the ELK stack for an ecommerce project. The project is already built with PHP and MySQL, and I want to use Elasticsearch for product search.
So I have to migrate the MySQL data into an Elasticsearch index via Logstash, right? But there are lots of other conditions, and condition-based queries are being run.
What I need to know: for this kind of situation, do I need to use multiple Logstash instances? If so, what does the conf file template for multiple Logstash instances look like?
Or is there another, easier way?

Thanks.

There's nothing in the description you've given that indicates a need to run multiple Logstash instances (especially not with the multi-pipeline support in Logstash 6.0), but as you're not giving any details it's hard to tell.
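For reference, with Logstash 6.x you can run several isolated pipelines inside one Logstash process by listing them in `pipelines.yml`. A minimal sketch (the pipeline IDs and config paths here are made-up examples, not anything from your setup):

```yaml
# pipelines.yml -- two independent pipelines in a single Logstash process
# (pipeline IDs and paths below are hypothetical; adjust to your install)
- pipeline.id: products
  path.config: "/etc/logstash/conf.d/products.conf"
- pipeline.id: orders
  path.config: "/etc/logstash/conf.d/orders.conf"
```

Each pipeline gets its own inputs, filters, and outputs, so there's no need to start separate Logstash processes just to keep event streams apart.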

I have written this code (.conf file) for multiple statements. Please tell me if I am wrong.

input {

	#first table

	jdbc {
		jdbc_driver_library => "/mysql-connector-java-5.1.44-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
		jdbc_user => "root"
		jdbc_password => ""
		schedule => "* * * * *"
		statement => "SELECT  @slno:=@slno+1 aut_es_1, es_qry_tbl.* FROM (SELECT * FROM `my_test_121223212`) es_qry_tbl, (SELECT @slno:=0) es_tbl"
		type => "user"
		add_field => { "SQLDescription" => "SQLDesc1" }
		use_column_value => true
		tracking_column => "aut_es_1"
	}

	#second table

	jdbc {
		jdbc_driver_library => "/mysql-connector-java-5.1.44-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
		jdbc_user => "root"
		jdbc_password => ""
		schedule => "* * * * *"
		statement => "SELECT  @slno:=@slno+1 aut_es_2, es_qry_tbl.* FROM (SELECT * FROM `my_test1212`) es_qry_tbl, (SELECT @slno:=0) es_tbl"
		type => "user"
		add_field => { "SQLDescription" => "SQLDesc2" }
		use_column_value => true
		tracking_column => "aut_es_2"
	}

	#third table

	jdbc {
		jdbc_driver_library => "/mysql-connector-java-5.1.44-bin.jar"
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
		jdbc_user => "root"
		jdbc_password => ""
		schedule => "* * * * *"
		statement => "SELECT  @slno:=@slno+1 aut_es_3, es_qry_tbl.* FROM (SELECT * FROM `my_test`) es_qry_tbl, (SELECT @slno:=0) es_tbl"
		type => "user"
		add_field => { "SQLDescription" => "SQLDesc3" }
		use_column_value => true
		tracking_column => "aut_es_3"
	}
}

filter {
  mutate {
    add_field => {
      "[@metadata][document_id]" => "%{aut_es_1}%{aut_es_2}%{aut_es_3}"
    }
  }
}

output {
	#stdout {codec => rubydebug}
	elasticsearch {
		hosts => "localhost:9200"
		index => "seacrh_gobinda_ind"
		document_id => "%{[@metadata][document_id]}"
	}
}
> "[@metadata][document_id]" => "%{aut_es_1}%{aut_es_2}%{aut_es_3}"

This won't work, since the aut_es_X values are fields of different events: each jdbc input produces its own event stream, and Logstash won't join them, so for any given event two of the three fields are missing. But can't you fetch everything in a single SQL query?
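For example (purely a sketch reusing the table names from your config; it assumes the three tables have compatible columns, which may not hold for your schema), the three statements could be merged with UNION ALL so a single jdbc input feeds the index:

```sql
-- Hypothetical sketch: one statement instead of three jdbc inputs,
-- tagging each row with the table it came from.
SELECT 'SQLDesc1' AS SQLDescription, t.* FROM my_test_121223212 t
UNION ALL
SELECT 'SQLDesc2' AS SQLDescription, t.* FROM my_test1212 t
UNION ALL
SELECT 'SQLDesc3' AS SQLDescription, t.* FROM my_test t
```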

No, I can't; these are all dependent/conditional queries.
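In that case the inputs stay separate, and each event only ever carries one of the aut_es_X fields, so the document ID has to be built from fields that actually exist on that event. A hedged sketch of the filter block (field names taken from the config above; the "t1_"/"t2_"/"t3_" prefixes are made up to keep IDs from the three tables from colliding):

```text
filter {
	# Pick the tracking column that this event actually has.
	if [aut_es_1] {
		mutate { add_field => { "[@metadata][document_id]" => "t1_%{aut_es_1}" } }
	} else if [aut_es_2] {
		mutate { add_field => { "[@metadata][document_id]" => "t2_%{aut_es_2}" } }
	} else if [aut_es_3] {
		mutate { add_field => { "[@metadata][document_id]" => "t3_%{aut_es_3}" } }
	}
}
```

The elasticsearch output can keep using `document_id => "%{[@metadata][document_id]}"` unchanged.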

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.