MySQL to Elasticsearch

For now, disable the elasticsearch output and insert a stdout { codec => rubydebug } output. Look in the logs; warning messages are emitted for various kinds of error conditions. If there's nothing interesting there, increase the logging verbosity by starting Logstash with --verbose or --debug and look in the logs again. You should, for example, see every query that's sent for execution.
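If it helps, here is a minimal sketch of what that debugging setup could look like, assuming a jdbc input pipeline roughly like yours (everything inside the jdbc block is a placeholder for your existing settings):

```
input {
  jdbc {
    # ... your existing jdbc settings (connection string, statement, schedule, ...)
  }
}

output {
  # elasticsearch { ... }            # temporarily disabled while debugging
  stdout { codec => rubydebug }      # print each event to the console instead
}
```

Once the events printed to stdout look right, re-enable the elasticsearch output.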

It's now working successfully :slightly_smiling:. My mistake was that the plugin was not completely updated. Thank you very much, Magnus. See you on another thread.

Regards,
Croos.

How do I update an existing document on a schedule when a MySQL table row is updated?
Example (before the update):

+---+------+---------+
|id | name | city    |
+---+------+---------+
| 11| croos| chennai |
| 22| nilu | colombo |
+---+------+---------+

After the city for id 11 is updated:

+---+------+---------+
|id | name | city    |
+---+------+---------+
| 11| croos| trichy  |
| 22| nilu | colombo |
+---+------+---------+

In this case, how do I update the city field in Elasticsearch on a schedule?

I answered a very similar question in the past couple of days. Set the elasticsearch output's document_id option to the desired id of the document (perhaps the "id" column) instead of letting Elasticsearch allocate a random id; then Logstash will be able to update the existing document. You may have to modify the action option too.
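As a rough sketch (not your exact config; the database, table, and column names below are just placeholders based on your example), the pipeline could look something like this:

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT id, name, city FROM people"
    schedule => "* * * * *"            # re-run the query every minute
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "people"
    document_id => "%{id}"             # reuse the MySQL id as the Elasticsearch _id
  }
}
```

With document_id set, the default index action replaces any existing document that has the same id on each scheduled run, so a changed city value shows up after the next run.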

Would you please give me some example links?

I found an example in your previous answer. For now, this is enough for me. Thanks, and catch you soon :grinning:!