Logstash JDBC and Firebird

I need some help setting up Logstash with JDBC against a Firebird 2.5 database; I'm using version 7.5 of both Elasticsearch and Logstash.
I always need to overwrite the information in my indexes, i.e. they should always hold the same number of records, just updated, because my filter is by date; I'm doing this because I'm testing a dashboard. The problem is that every time Logstash loads the information it mixes up the records. I have already tried several configurations, but since the manual doesn't cover this, my test setup is below.
In summary, the result of each SELECT gets mixed with the subsequent one.
*** I used Google Translate, so apologies for any mistakes in my English.

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Siamaco\Downloads\Jaybird-3.0.8-JDK_1.8\jaybird-full-3.0.8.jar"
    jdbc_driver_class => "org.firebirdsql.jdbc.FBDriver"
    jdbc_connection_string => "jdbc:firebirdsql:127.0.0.1/3050:C:\_Developer\_SVN\Siamaco\ATIVOS\Hypar\hotFix-4000\Projetos\exec\BASEDADOS.FDB"
    jdbc_user => "SYSDBA"
    jdbc_password => "masterkey"
    statement_filepath => "C:\_Developer\Elastic\SQL\ContasReceber\cheques_devolvidos.sql"
    jdbc_paging_enabled => false
    jdbc_page_size => 0
    use_column_value => false
    # tracking_column => "total_cheque_devolvido"
    # last_run_metadata_path => "C:\_Developer\Elastic\logstash-7.5.0\bin\last_run_cheques_devolvidos.txt"
    record_last_run => false
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cheques_devolvidos"
    document_id => "%{total_cheque_devolvido}"
  }
}

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Siamaco\Downloads\Jaybird-3.0.8-JDK_1.8\jaybird-full-3.0.8.jar"
    jdbc_driver_class => "org.firebirdsql.jdbc.FBDriver"
    jdbc_connection_string => "jdbc:firebirdsql:127.0.0.1/3050:C:\_Developer\_SVN\Siamaco\ATIVOS\Hypar\hotFix-4000\Projetos\exec\BASEDADOS.FDB"
    jdbc_user => "SYSDBA"
    jdbc_password => "masterkey"
    statement_filepath => "C:\_Developer\Elastic\SQL\ContasReceber\ranking_clientes.sql"
    jdbc_paging_enabled => false
    jdbc_page_size => 0
    use_column_value => false
    # tracking_column => "nr_cliente"
    # last_run_metadata_path => "C:\_Developer\Elastic\logstash-7.5.0\bin\last_run_ranking_clientes.txt"
    record_last_run => false
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ranking_clientes"
    document_id => "%{nr_cliente}"
  }
}

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Siamaco\Downloads\Jaybird-3.0.8-JDK_1.8\jaybird-full-3.0.8.jar"
    jdbc_driver_class => "org.firebirdsql.jdbc.FBDriver"
    jdbc_connection_string => "jdbc:firebirdsql:127.0.0.1/3050:C:\_Developer\_SVN\Siamaco\ATIVOS\Hypar\hotFix-4000\Projetos\exec\BASEDADOS.FDB"
    jdbc_user => "SYSDBA"
    jdbc_password => "masterkey"
    statement_filepath => "C:\_Developer\Elastic\SQL\ContasReceber\valor_todos_titulos_abertos.sql"
    jdbc_paging_enabled => false
    jdbc_page_size => 0
    use_column_value => false
    # tracking_column => "total_titulos_abertos"
    # last_run_metadata_path => "C:\_Developer\Elastic\logstash-7.5.0\bin\last_run_valor_todos_titulos_abertos.txt"
    record_last_run => false
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "valor_todos_titulos_abertos"
    document_id => "%{total_titulos_abertos}"
  }
}

Any idea?

That is expected. When a pipeline is started, events are read from all of the inputs, put through the filters, and then written to all of the outputs.

Either split your configuration into 3 files and run each one in a different pipeline, or use conditionals. For example:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Siamaco\Downloads\Jaybird-3.0.8-JDK_1.8\jaybird-full-3.0.8.jar"
    [...]
    statement_filepath => "C:\_Developer\Elastic\SQL\ContasReceber\cheques_devolvidos.sql"
    [...]
    tags => [ "cheques_devolvidos" ]
  }
}

output {
  if "cheques_devolvidos" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "cheques_devolvidos"
      document_id => "%{total_cheque_devolvido}"
    }
  }
}
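The first option, one configuration file per pipeline, is wired up in Logstash's `config/pipelines.yml`; each pipeline then has its own inputs and outputs, so events from one jdbc input never reach another pipeline's output. A minimal sketch, assuming the three configurations above have been split into the hypothetical `.conf` files named below:

```yaml
# config/pipelines.yml -- one pipeline per query/index pair.
# The pipeline ids and .conf file paths here are examples, not from the thread.
- pipeline.id: cheques_devolvidos
  path.config: "C:/_Developer/Elastic/logstash-7.5.0/conf/cheques_devolvidos.conf"
- pipeline.id: ranking_clientes
  path.config: "C:/_Developer/Elastic/logstash-7.5.0/conf/ranking_clientes.conf"
- pipeline.id: valor_todos_titulos_abertos
  path.config: "C:/_Developer/Elastic/logstash-7.5.0/conf/valor_todos_titulos_abertos.conf"
```

With this file in place, starting Logstash without `-f` runs all three pipelines in the same process, so a single service instance is enough.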

Thanks a lot for the tip; my updated code now looks like this:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Siamaco\Downloads\Jaybird-3.0.8-JDK_1.8\jaybird-full-3.0.8.jar"
    jdbc_driver_class => "org.firebirdsql.jdbc.FBDriver"
    jdbc_connection_string => "jdbc:firebirdsql:127.0.0.1/3050:C:\_Developer\_SVN\Siamaco\ATIVOS\Hypar\hotFix-4000\Projetos\exec\BASEDADOS.FDB"
    jdbc_user => "SYSDBA"
    jdbc_password => "masterkey"
    statement_filepath => "C:\_Developer\Elastic\SQL\ContasReceber\valor_todos_titulos_abertos.sql"
    use_column_value => true
    tracking_column => "total_titulos_abertos"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    tracking_column_type => "numeric"
    last_run_metadata_path => "C:\_Developer\Elastic\logstash-7.5.0\bin\last_run_valor_todos_titulos_abertos.txt"
    tags => [ "valor_todos_titulos_abertos" ]
  }
}

output {
  if "valor_todos_titulos_abertos" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "valor_todos_titulos_abertos"
      document_id => "%{total_titulos_abertos}"
    }
  }
}

Out of curiosity: when you said to run it in three different files, isn't that complex? Does it still run as a service?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.