Repeated inserts when using the logstash-input-jdbc plugin to dump data from MySQL to Elasticsearch

Hi,
I use the logstash-input-jdbc plugin to dump data from MySQL to Elasticsearch.
Here is the input section of my logstash.conf:

jdbc {
  jdbc_connection_string => "jdbc:mysql://192.168.0.49:3306/dfb"
  jdbc_user => "test"
  jdbc_password => "test"
  jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.36-bin.jar"
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  jdbc_paging_enabled => "true"
  jdbc_page_size => "50000"
  statement_filepath => "jdbc.sql"
  schedule => "* * * * *"
  type => "jdbc"
}

The SQL in jdbc.sql is:

select h.id as id, h.hotel_name as name, h.photo_url as img, ha.id as haId, ha.finance_person
from hotel h
LEFT JOIN hotel_account ha on h.id = ha.hotel_id
where h.last_modify_time >= :sql_last_start

After setting everything up and starting Logstash for the first time, it ran well: all the data was dumped into ES, and the console sat waiting.
When I executed an INSERT statement, the console printed a message and the new row appeared in ES.
But about 30 seconds later, the console printed the message again, and ES inserted the same data a second time.
I tested an UPDATE statement: same result!

I do not know what is wrong with Logstash, and I have no idea how the :sql_last_start parameter is supposed to work.

Reason:
Logstash is a tool that feeds input into Elasticsearch; it is not built into Elasticsearch and has no knowledge of what data already exists there, so it blindly inserts documents into Elasticsearch exactly as the config tells it to.
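
Because of this, one common way to make repeated runs idempotent is to give each document a deterministic ID in the elasticsearch output, so a re-fetched row overwrites the existing document instead of creating a duplicate. A minimal sketch; the hosts value and the "hotels" index name are assumptions, and it relies on the query's id column uniquely identifying a row:

output {
  elasticsearch {
    hosts => ["localhost:9200"]       # assumed ES address
    index => "hotels"                 # hypothetical index name
    document_id => "%{id}"            # reuse the row's primary key as the ES _id
  }
}

With this in place, re-running the same query updates documents in place rather than multiplying them, although it does not by itself reduce how much data each scheduled run pulls from MySQL.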

Solution:
Once the full index has been built in Elasticsearch, keep an intermediate record of the timestamp of the table's latest update. Based on that timestamp you can fire any number of incremental queries, with the scheduler in place, so that each incremental query fetches only the new or changed rows and the scheduler picks up the latest data at a defined interval. A sketch of this setup follows.
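
As a sketch of the incremental setup: the jdbc input can itself persist the last-seen value between runs, so you do not need to maintain the tracking table by hand. Note that newer versions of the plugin renamed :sql_last_start to :sql_last_value; the tracking options below assume a plugin version that supports them, and the metadata path is a hypothetical choice:

jdbc {
  jdbc_connection_string => "jdbc:mysql://192.168.0.49:3306/dfb"
  jdbc_user => "test"
  jdbc_password => "test"
  jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.36-bin.jar"
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  schedule => "* * * * *"
  # Track the last-seen modification time and persist it between runs,
  # so each scheduled run fetches only rows changed since the previous run.
  use_column_value => true
  tracking_column => "last_modify_time"
  record_last_run => true
  last_run_metadata_path => "/opt/logstash/.jdbc_last_run"   # hypothetical path
  statement => "select h.id as id, h.hotel_name as name, h.photo_url as img from hotel h where h.last_modify_time > :sql_last_value"
}

Using a strict > comparison (rather than >=) avoids re-fetching the boundary row on every run, which is one source of the repeated inserts described above.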