Back up and create new data

Hi community,
we are using Logstash to import data from an Oracle DB into Elasticsearch every day.

What would be the best workflow and steps to automate this job?

These are the current steps; each runs individually. Is there any way to make each step run only after the previous one has succeeded?

  • Deleting the index through Logstash HTTP polling
  • Importing the data into Elasticsearch using the jdbc plugin
  • Taking an index backup and deleting the previous day's backup, again through HTTP polling

There's no way of automating this from within Logstash.
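
The usual approach is to drive the whole sequence from an external scheduler such as cron: one script performs each step in order and aborts as soon as a step fails. Here is a minimal sketch in Python, assuming hypothetical names throughout (a `daily_index` index, a `my_backup` snapshot repository, a pipeline config at `/etc/logstash/import.conf`) and using the standard Elasticsearch delete-index and snapshot APIs:

```python
#!/usr/bin/env python3
"""Nightly reindex driver: each step runs only if the previous one succeeded.

All names below (index, snapshot repository, config path) are placeholders.
"""
import datetime
import subprocess
import sys

import requests  # third-party: pip install requests

ES = "http://localhost:9200"
INDEX = "daily_index"                    # hypothetical index name
REPO = "my_backup"                       # hypothetical snapshot repository
PIPELINE = "/etc/logstash/import.conf"   # hypothetical jdbc pipeline config


def check(resp):
    """Abort the whole run if an HTTP step fails."""
    if not resp.ok:
        sys.exit(f"step failed: {resp.status_code} {resp.text}")


today = datetime.date.today()
yesterday = today - datetime.timedelta(days=1)

# Step 1: delete the old index (a 404 on the very first run is fine).
resp = requests.delete(f"{ES}/{INDEX}")
if resp.status_code != 404:
    check(resp)

# Step 2: run the Logstash import; check=True aborts on a non-zero exit code.
subprocess.run(["logstash", "-f", PIPELINE], check=True)

# Step 3: snapshot today's index, then drop yesterday's snapshot.
check(requests.put(
    f"{ES}/_snapshot/{REPO}/snapshot-{today}",
    params={"wait_for_completion": "true"},
    json={"indices": INDEX},
))
check(requests.delete(f"{ES}/_snapshot/{REPO}/snapshot-{yesterday}"))
```

Cron then only needs to start this one script every night; the ordering and the stop-on-failure behaviour live in the script itself.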

Do you really need to reindex from scratch each time? Why?

Because the data changes every day.

That doesn't explain why you can't update the index incrementally, e.g. by only updating the rows that have changed (including added rows).
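
For what it's worth, the jdbc input supports this pattern out of the box: with `use_column_value` and `tracking_column` it remembers the last value it saw (exposed to the query as `:sql_last_value`), so each scheduled run only fetches rows added or modified since the previous run. A rough sketch, assuming a hypothetical table `my_table` with `id` and `updated_at` columns:

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//dbhost:1521/service"  # hypothetical
    jdbc_user => "user"
    jdbc_driver_library => "/path/to/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    schedule => "0 2 * * *"                 # once a day at 02:00
    statement => "SELECT * FROM my_table WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"         # persisted between runs
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "daily_index"                  # hypothetical index name
    document_id => "%{id}"                  # stable id: changed rows replace the old document
  }
}
```

One caveat: rows deleted in Oracle won't disappear from the index this way, so if deletes matter you may still want an occasional full rebuild on top of the incremental runs.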
