Logstash JDBC MySQL > Elasticsearch... not all records are imported

We have MySQL tables with this kind of dataset (10 tables, a few hundred thousand entries):

field_129:

{"MANDT":"200","BUKRS":"3000","ANLN1":"000000002130","ANLN2":"0194","ANLKL":"00002000","GEGST":"","ANLAR":"","ERNAM":"GASPARD","ERDAT":"1997-01-21","AENAM":"","AEDAT":"--","XLOEV":"","XSPEB":"","FELEI":"2000","KTOGR":"00020000","XOPVW":"","ANLTP":0,"ZUJHR":1997,"ZUPER":1,"ZUGDT":"1997-01-01","AKTIV":"1994-10-10","ABGDT":"0000-00-00","DEAKT":"0000-00-00","GPLAB":"0000-00-00","BSTDT":"0000-00-00","ORD41":"","ORD42":"","ORD43":"","ORD44":"","ANLUE":"","ZUAWA":"","ANEQK":0,"ANEQS":0,"LIFNR":"","LAND1":"","LIEFE":"","HERST":"","EIGKZ":"1","AIBN1":"000000002005","AIBN2":"0000","AIBDT":"1993-04-01","URJHR":0,"URWRT":"0.00 ","ANTEI":"0.00 ","PROJN":"","EAUFN":"","MEINS":"","MENGE":"0.000 ","TYPBZ":"","IZWEK":"","INKEN":"X","IVDAT":"0000-00-00","INVZU":"","VMGLI":"0000","XVRMW":"","WRTMA":"0.00 ","EHWRT":"0.00 ","AUFLA":"0000-00-00","EHWZU":"0000-00-00","EHWNR":"","GRUVO":"0000-00-00","GREIN":"0000-00-00","GRBND":"","GRBLT":"","GRLFD":"","FLURK":"","FLURN":"","FIAMT":"","STADT":"","GRUND":"","FEINS":"","GRUFL":"0.000 ","INVNR":"","VBUND":"","SPRAS":"E","TXT50":"Assembly Line KL-567/93","TXA50":"","XLTXID":"","XVERID":"","XTCHID":"","XKALID":"","XHERID":"","XLEAID":"","LEAFI":"","LVDAT":"0000-00-00","LKDAT":"0000-00-00","LEABG":"0000-00-00","LEJAR":0,"LEPER":0,"LRYTH":0,"LEGEB":"0.00 ","LBASW":"0.00 ","LKAUF":"0.00 ","LMZIN":"0.0000 ","LZINS":"0.0000 ","LTZBW":"0000-00-00","LKUZA":"0.00 ","LKUZI":"0.00 ","LLAVB":"0.00 ","LEANZ":"0 ","LVTNR":"","LETXT":"","XAKTIV":"","ANUPD":"","LBLNR":"","XV0DT":"0000-00-00","XV0NM":"","XV1DT":"1997-01-21","XV1NM":"GASPARD","XV2DT":"0000-00-00","XV2NM":"","XV3DT":"0000-00-00","XV3NM":"","XV4DT":"0000-00-00","XV4NM":"","XV5DT":"0000-00-00","XV5NM":"","XV6DT":"0000-00-00","XV6NM":"","AIMMO":0,"OBJNR":"","LEART":"","LVORS":"","GDLGRP":"","POSNR":0,"XERWRT":"","XAFABCH":"","XANLGR":"","MCOA1":"ASSEMBLY LINE KL-567/93","XINVM":"","SERNR":"","UMWKZ":"","LRVDAT":"0000-00-00","ACT_CHANGE_PM":"","HAS_TDDP":"","LAST_REORG_DATE":"0000-00-00","ANLH":{"MANDT":"200","BUKRS":"3000","ANLN1":"000000002130","LUNTN":"0194","LANEP":2,"ANUPD":"","FUNTN":"0000","ANLHTXT":""},"ANLZ":[{"MANDT":"200","BUKRS":"3000","ANLN1":"000000002130","ANLN2":"0194","BDATU":"9999-12-31","ADATU":"1900-01-01","KOSTL":"0000004270","WERKS":"3200","GSBER":"5000","LSTAR":"1420","MSFAK":"0.00 ","XSTIL":"","STORT":"1","CAUFN":"","PLAN1":"","PLAN2":"","RAUMN":"","IAUFN":"","IPROJ":"","TPLKZ":"","TPLNR":"","ANUPD":"","TXJCD":"","IPSNR":0,"KFZKZ":"","PERNR":0,"KOSTLV":"","FISTL":"","GEBER":"","FKBER":"","GRANT_NBR":"","GEBER2":"","FKBER2":"","GRANT_NBR2":"","FISTL2":"","IMKEY":"","PS_PSP_PNR2":0,"BUDGET_PD":"","BUDGET_PD2":"","SEGMENT":"","PRCTR":""}]}

field_133:

3000

field_134:

000000002130

field_135:

0194

We store everything in one index; each table has its own field.

The config looks like this:

input {
        jdbc {
                jdbc_driver_library => "/home/logstash/mysql-connector-java-5.1.42/mysql-connector-java-5.1.42-bin.jar"
                jdbc_driver_class => "com.mysql.jdbc.Driver"
                jdbc_connection_string => "jdbc:mysql://xxx.xxx.xxx.69/c1_db_etltest"
                jdbc_user => "xxx"
                jdbc_password => "xxx"
                statement => "SELECT CONCAT(field_133,'-',field_134,'-',field_135) as id, field_129 as data_orig from _datasource_33"
        }
}

filter {
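        # rewrite MySQL zero dates in the raw JSON string before parsing;
        # "0000-00-00" is not a valid date and would presumably be rejected downstream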
        mutate {
                gsub => ["data_orig","0000-00-00","1970-01-01"]
        }

        json {
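                # parse the JSON payload into top-level event fields,
                # then drop the raw string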
                source => "data_orig"
                remove_field => ["data_orig"]
        }
}

output {
        stdout {codec => json_lines}
        #stdout {codec => json}
        elasticsearch {
                hosts => ["localhost:9200"]
                index => "etl"
                document_type => "asset"
                document_id => "%{id}"
        }
}
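For reference, the mutate/json part of this pipeline can be exercised in isolation. Here is a minimal sketch using a stdin input, where the payload arrives in the default message field instead of data_orig:

input { stdin {} }                # paste one field_129 payload per line

filter {
        mutate {
                gsub => ["message","0000-00-00","1970-01-01"]
        }
        json {
                source => "message"
        }
}

output {
        stdout { codec => rubydebug }
}

Pasting a row's field_129 value should print the fully parsed event; any line the json filter cannot parse stays in the pipeline and is tagged _jsonparsefailure.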

Most of the data is imported into Elasticsearch, but for some tables the import is only partial (e.g. only 700 instead of 1,400 records). The concatenated IDs that are used as document_id are unique.
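Since Elasticsearch silently overwrites documents that share a document_id rather than raising an error, one sanity check is to confirm on the MySQL side that the concatenated key really is unique. A query along these lines against the table above should return no rows:

SELECT CONCAT(field_133,'-',field_134,'-',field_135) AS id, COUNT(*) AS cnt
FROM _datasource_33
GROUP BY id
HAVING cnt > 1;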

The import quits with this message:

[2017-09-20T15:27:45,927][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}

What could be the problem? Even with debugging enabled, I can't see any errors.

There's nothing in the log prior to that warning? Have you tried bumping up the log level?

With logstash [...] --log.level=trace we get:

[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x63f35eca>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x6d17754d>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x1495a57a>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x365bea12>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x2f8d6fbf>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x248fe639>
[2017-09-27T16:08:10,342][DEBUG][logstash.pipeline        ] Shutdown waiting for worker thread #<Thread:0x23c11eb1>
[2017-09-27T16:08:10,342][DEBUG][logstash.filters.mutate  ] closing {:plugin=>"LogStash::Filters::Mutate"}
[2017-09-27T16:08:10,342][DEBUG][logstash.filters.json    ] closing {:plugin=>"LogStash::Filters::Json"}
[2017-09-27T16:08:10,342][DEBUG][logstash.outputs.stdout  ] closing {:plugin=>"LogStash::Outputs::Stdout"}
[2017-09-27T16:08:10,342][DEBUG][logstash.outputs.elasticsearch] closing {:plugin=>"LogStash::Outputs::ElasticSearch"}
[2017-09-27T16:08:10,342][DEBUG][logstash.outputs.elasticsearch] Stopping sniffer
[2017-09-27T16:08:10,342][DEBUG][logstash.outputs.elasticsearch] Stopping resurrectionist
[2017-09-27T16:08:11,262][DEBUG][logstash.outputs.elasticsearch] Waiting for in use manticore connections
[2017-09-27T16:08:11,263][DEBUG][logstash.outputs.elasticsearch] Closing adapter #<LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter:0x49026baf>
[2017-09-27T16:08:11,263][DEBUG][logstash.pipeline        ] Pipeline main has been shutdown

The log file looks fine, then? What could be a solution for importing all of the data?
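One way to check whether events get lost between the filters and Elasticsearch would be to split the output on the _jsonparsefailure tag that the json filter adds on parse errors. A sketch, with the file path as a placeholder:

output {
        # events that failed JSON parsing are diverted for inspection
        if "_jsonparsefailure" in [tags] {
                file { path => "/tmp/etl-json-failures.log" }
        } else {
                elasticsearch {
                        hosts => ["localhost:9200"]
                        index => "etl"
                        document_type => "asset"
                        document_id => "%{id}"
                }
        }
}

Comparing the number of lines in that file plus the document count in the etl index against the MySQL row count would show at which stage the missing records disappear.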
