I am using Logstash to transfer data from MySQL (WAMP) to Elasticsearch.
The query I am using is extremely complex, with lots of joins and many columns in the select statement, something like this:
select CONVERT_TZ(so.purchased_at, 'UTC', m.timezone) AS purchased_at,
       CONVERT_TZ(so.last_updated_at, 'UTC', m.timezone) AS last_updated_at,
       ......
       IF(p.price is not null, p.price * soi.quantity_ordered, 0))) >= IF(p.discount_threshold, p.discount_threshold, IF(u.default_discount_threshold, u.default_discount_threshold, 0.5)), 1, 0) AS promo
from `sale_order_items` as `soi`
inner join `sale_orders` as `so` on `soi`.`sale_order_id` = `so`.`id`
inner join `products` as `p` on `soi`.`product_id` = `p`.`id`
inner join `users` as `u` on `so`.`user_id` = `u`.`id`
inner join `marketplaces` as `m` on `u`.`marketplace_id` = `m`.`id`
where `u`.`id` in (2) and ((`u`.`id` = 2))
order by CONVERT_TZ(so.purchased_at, 'UTC', m.timezone) desc
When I run it using bin\logstash -f db.conf, the Logstash pipeline ends without any error, but when I view the newly created index in Kibana I can see many deleted documents for that particular index. I don't know why documents are being deleted.
yellow open orders fbAcP-KRTy6lpKZUA 5 1 123 6790 156.7mb 156.7mb
Deleted documents: 6790
This is my db.conf:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/db"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "D:\ELK\logstash-5.6.2\mysql-connector-java-5.1.44-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_fetch_size => 100
    statement => "select * from sale_order_items"
  }
}
output {
  elasticsearch {
    index => "orders"
    document_type => "order"
    document_id => "%{id}"
    hosts => "localhost:9200"
  }
}
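Could it be related to document_id => "%{id}"? My understanding is that rows sharing the same id value would then be indexed under the same document _id, so later rows overwrite earlier ones, and Elasticsearch counts each overwrite as a deleted document until segments are merged. A rough Python sketch of what I suspect is happening (the rows and field names here are made up, and a plain dict stands in for the index):

```python
# Stand-in for the Elasticsearch index when document_id => "%{id}":
# each row is keyed by its id, so rows sharing an id overwrite each other.
rows = [
    {"id": 1, "qty": 2},
    {"id": 1, "qty": 5},   # same id: replaces the previous document
    {"id": 2, "qty": 7},
]

index = {}
overwrites = 0
for row in rows:
    if row["id"] in index:
        overwrites += 1    # in Elasticsearch this would show up as a deleted doc
    index[row["id"]] = row

print(len(index), overwrites)  # → 2 1  (2 live documents, 1 overwritten)
```

If that is the cause, it would match my stats above: only 123 live documents, but 6790 deleted ones.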
Thank you.