I have the following config:
input {
  jdbc { ...
    statement => "select * from tab_1st"
    type => "1st"
  }
  jdbc { ...
    statement => "select * from tab_2nd"
    type => "2nd"
  }
  jdbc { ...
    statement => "select * from tab_3rd"
    type => "3rd"
  }
}
filter { ...}
output { # update a status table regarding processing progress
  if [type] == "1st" {
    jdbc { ...
      statement => [ "update log set status='P' where log_id=(select es_etl_log_id from es_etl_log where status='N')" ]
    }
  }
  if [type] == "3rd" {
    jdbc { ...
      statement => [ "update log set status='C' where log_id=(select es_etl_log_id from es_etl_log where status='P')" ]
    }
  }
}
I am using Logstash 5.1 and Elasticsearch 5.1, with:
logstash-input-jdbc (4.1.3)
logstash-output-jdbc (5.1.0)
It seems to me that Logstash does not process the inputs in the 1st, 2nd, 3rd order. It could be 3rd....
My config is already one file, in the correct order. Combining does not change anything; the order stays the same.
The problem is not the config order. It is how Logstash processes the input. So far, it looks like it works this way:
Get input 1
Process filter 1, get input 2
Output 1, process filter 2, get input 3
Output 2, process filter 3
Output 3
See the problem?
I want them to be processed sequentially, not in parallel, since I specified 1 thread (1 worker); the worker setting is sketched after the list below. What I would like is:
Get input 1
Process filter 1
Output 1
Get input 2
Process filter 2
Output 2
Get input 3
....
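
For reference, this is roughly how the single worker is set (a sketch of my settings; the same value can also be passed on the command line):

# logstash.yml -- sketch of the worker setting
# (command-line equivalent: bin/logstash -w 1 -f <my config file>)
pipeline.workers: 1

Even with a single worker, the behavior above still interleaves input, filter, and output across events instead of finishing one input's events before starting the next.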