Load index to elasticsearch

Hi,
I have a problem when loading an index with a .conf file. When I launch the command C:\Users\Standard\Documents\kibi-4.5.3-windows64\logstash\bin\logstash -f ordineditrasporto.conf, the execution completes correctly. But when I look in the Elasticsearch plugin GUI, the ordineditrasporto index contains far more docs than the actual number of records in my MySQL database. Worse, the doc count grows without limit, and Elasticsearch keeps processing in a loop. For three indexes, however, the load into Elasticsearch is correct: veicolo (240 records), vettore (40 records), and cella (3844 records) all end up with the right number of docs (I stop Logstash as soon as I see 40, 240, or 3844). The problem appears for the other indexes.

This is an example of conf file:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Standard\Documents\kibi-4.5.3-windows64\elasticsearch\lib\mysql-connector-java-5.1.38-bin"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/optfrancesco?autoReconnect=true&useSSL=false"
    jdbc_user => "root"
    jdbc_password => "P1nkFloyd"
    statement => "SELECT * FROM optfrancesco.pianoconsegna"
    schedule => "* * * * *"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "13619"
  }
}

output {
  elasticsearch {
    action => "index"
    index => "pianoconsegna"
    document_type => "Pianoconsegna"
    manage_template => false
    doc_as_upsert => true
    hosts => ["127.0.0.1"]
  }
}
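I wonder whether the schedule is the cause: schedule => "* * * * *" re-runs the full SELECT every minute, and since no document_id is set, Elasticsearch assigns a fresh ID to every row on every run, so the same records would be indexed again and again. Would an output block along these lines (just a sketch, assuming the table's primary key column is named id, which may not match my schema) make re-runs overwrite instead of duplicate?

output {
  elasticsearch {
    action => "index"
    index => "pianoconsegna"
    # hypothetical: "id" stands in for the table's primary key column,
    # so each scheduled run updates the same docs instead of adding new ones
    document_id => "%{id}"
    document_type => "Pianoconsegna"
    manage_template => false
    hosts => ["127.0.0.1"]
  }
}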

What can I do?

Thank you,

Francesco

You're using Kibi here right?

Yes, I'm using Kibi, but I can't do anything if the data in Elasticsearch aren't correct.

Thanks