Logstash JDBC plugin question

Greetings,

Presume I have a database table containing status info about processing jobs.

This table would have the following columns, with an example row:

JobID - Jobname - RunDate - Status - StartTimestamp - EndTimestamp - Other informative columns


1 - job - 10/02/2016 - WAITING - 10:00 - 10:05 - ...

Now presume I have the Logstash JDBC plugin running a query periodically to select all jobs with their status info. How would an update of the job status of this row (WAITING --> SUCCEEDED) in the database table affect Elasticsearch? Would a new document with the updated status be inserted into Elasticsearch, or would the existing document be updated?

Whichever you prefer. By default, each event passing through Logstash results in a new document in ES, but you can override that behavior by setting the elasticsearch output's document_id option, which chooses the id of the document to index or update. For example, if the database table's JobID column is a suitable primary key in the ES index you should be able to do this to keep the job's information updated:

elasticsearch {
  ...
  document_id => "%{JobID}"
}
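
For context, a minimal end-to-end pipeline might look like the sketch below. The connection settings, table name, and index name are assumptions to adjust for your environment. Note that the jdbc input lowercases column names by default (controlled by its lowercase_column_names option), so the field reference in the output becomes %{jobid}:

input {
  jdbc {
    # Assumed driver and connection details -- replace with your own.
    jdbc_driver_library => "/path/to/jdbc-driver.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/jobsdb"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    schedule => "* * * * *"   # run the query once a minute
    statement => "SELECT JobID, Jobname, RunDate, Status, StartTimestamp, EndTimestamp FROM jobs"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "jobs"
    # Column names are lowercased by the jdbc input by default.
    document_id => "%{jobid}"
  }
}

With this setup, each periodic query re-indexes every job under its JobID, so when the Status column changes from WAITING to SUCCEEDED the next run overwrites the existing document rather than creating a second one.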