How to delete an index and schedule data in a Logstash config file

Hi, I have scheduled my data import in a Logstash config file. It works fine, but duplicate data ends up in the index.

So I tried to delete the index first and then update the data, but it's not working: the config file runs, yet the index is not deleted and duplicates still appear.

Please suggest how to delete the index first and then load the data.

I have attached my config file:

input {
  exec {
    type => "curl"
    command => 'curl -XDELETE "http://ipaddress:9200/schedule"'
    codec => json
    interval => 120
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://192.168.:3306/dbname"
    jdbc_user => ""
    jdbc_password => "*"
    jdbc_validate_connection => true
    jdbc_paging_enabled => "true"
    jdbc_page_size => 100000
    jdbc_driver_library => "D:\ELK\com.mysql.jdbc_5.1.5.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    clean_run => true
    schedule => "* * * * *"
    statement => "SELECT * FROM tbl_schedule"
  }
}
output {
  elasticsearch {
    hosts => ["http://ipaddress:9200"]
    index => "schedule"
  }
  stdout { codec => rubydebug }
}

Why do you want to delete the data?

Because duplicates appear in my index.

E.g., the first time I upload 10 records; after each scheduled run it adds another 10 records, and this continues for every run. So I want to delete the index first and then upload.

Why not just use a custom _id for each document, so that the next time it gets sent to Elasticsearch it just overwrites the existing one?
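One way to realize this suggestion, as a sketch only: derive a stable _id from the row's key with the fingerprint filter, then pass it to the elasticsearch output as document_id, so each scheduled run overwrites existing documents instead of appending duplicates. The primary-key column name `id` below is an assumption; substitute your table's actual key column.

filter {
  fingerprint {
    source => ["id"]                 # assumed primary-key column from the SQL result
    target => "[@metadata][doc_id]"  # @metadata fields are not indexed into the document
    method => "MURMUR3"
  }
}
output {
  elasticsearch {
    hosts => ["http://ipaddress:9200"]
    index => "schedule"
    document_id => "%{[@metadata][doc_id]}"
  }
}

Storing the fingerprint under [@metadata] keeps the generated _id out of the document source while still making it available to the output plugin.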


I tried to add a document_id in my Logstash config file, but then it fetches only one record.

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://ipaddress/sam_sit3"
    # The user we wish to execute our statement as
    jdbc_user => ""
    jdbc_password => ""
    jdbc_validate_connection => true
    jdbc_paging_enabled => "true"
    jdbc_page_size => 100000
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "D:\ELK\com.mysql.jdbc_5.1.5.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    clean_run => true
    schedule => "* * * * *"
    statement => "SELECT * FROM tbl_schedule"
  }
}
output {
  elasticsearch {
    hosts => ["http://ipaddress:9200"]
    document_id => "%{s.id}"
    index => "schedule"
  }
  stdout { codec => rubydebug }
}
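A likely cause of the single-record symptom, for reference: with `statement => "SELECT * FROM tbl_schedule"` the jdbc input emits fields named after the bare column names (e.g. `id`), not alias-qualified names like `s.id`. When a sprintf reference such as `%{s.id}` matches no field, Logstash leaves the literal text `%{s.id}` in place, so every document is indexed under the same _id and overwrites the previous one. A sketch of a corrected output, assuming the primary-key column is named `id` (an assumption; use your actual column name):

output {
  elasticsearch {
    hosts => ["http://ipaddress:9200"]
    index => "schedule"
    document_id => "%{id}"  # bare column name; alias-qualified names like s.id do not exist as event fields
  }
  stdout { codec => rubydebug }
}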

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.