How can I sync an entire DynamoDB table with Elasticsearch?

Hi all, I am having trouble connecting ES to DynamoDB. Today I use a Python Lambda (Serverless) to send every DynamoDB change (insert, modify, remove) to the Elasticsearch clusters.
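For reference, a stream-triggered handler of the kind described above might be sketched like this. The dispatch logic is grounded in the DynamoDB Streams record format (eventName is INSERT, MODIFY, or REMOVE); the callbacks are placeholders where the real Elasticsearch index/delete calls would go:

```python
# Minimal sketch of the dispatch step of a DynamoDB Streams Lambda.
# index_doc / delete_doc are hypothetical callbacks: in the real handler
# they would wrap Elasticsearch index and delete requests.

def handle_records(records, index_doc, delete_doc):
    """Dispatch stream records: index on INSERT/MODIFY, delete on REMOVE.

    Each record carries the item in DynamoDB's attribute-value format
    under dynamodb.NewImage (or just the key under dynamodb.Keys for a
    REMOVE event).
    """
    for record in records:
        name = record["eventName"]
        if name in ("INSERT", "MODIFY"):
            index_doc(record["dynamodb"]["NewImage"])
        elif name == "REMOVE":
            delete_doc(record["dynamodb"]["Keys"])
```

A Lambda entry point would call `handle_records(event["Records"], ...)` with callbacks backed by the Elasticsearch client.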
But we want to be able to reindex old dynamodb tables.
We used a workaround: triggering a bulk update on every DynamoDB item so the Lambda would detect the changes and send them to ES. This doesn't seem like the best solution, as it writes to production tables.
I would love any kind of suggestion, thanks in advance!

I don't know DynamoDB, but can't you just run a request that selects everything from the table and send that to Elasticsearch, instead of trying to reuse the same "watch for changes" code?
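In Python that one-off backfill could be sketched roughly as below. The pagination logic follows DynamoDB's Scan API (keep calling with ExclusiveStartKey until LastEvaluatedKey is absent); the table name, index name, and top-level "id" attribute are assumptions to adapt to your schema:

```python
# Sketch of a one-off backfill: page through a DynamoDB Scan and shape
# every item into bulk actions for Elasticsearch. The scan/bulk wiring
# at the bottom requires boto3 and the elasticsearch client.

def scan_all(scan_fn):
    """Yield every item from a paginated Scan.

    scan_fn is called like boto3's Table.scan and must return a dict
    with "Items" and, while more pages remain, "LastEvaluatedKey".
    """
    kwargs = {}
    while True:
        page = scan_fn(**kwargs)
        yield from page["Items"]
        last = page.get("LastEvaluatedKey")
        if last is None:
            return
        # Resume the scan where the previous page stopped.
        kwargs["ExclusiveStartKey"] = last

def to_bulk_actions(items, index="my-index"):
    """Shape items for elasticsearch.helpers.bulk.

    Assumes each item has a top-level "id" attribute to use as the
    document id; adjust to your table's key schema.
    """
    for item in items:
        yield {"_index": index, "_id": item["id"], "_source": item}

if __name__ == "__main__":
    # Real wiring (placeholder table/index/endpoint names):
    import boto3
    from elasticsearch import Elasticsearch, helpers

    table = boto3.resource("dynamodb").Table("my-table")
    es = Elasticsearch(["http://localhost:9200"])
    helpers.bulk(es, to_bulk_actions(scan_all(table.scan)))
```

Keeping the pagination in a small helper also makes it easy to test against a fake scan function before pointing it at a production table.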


Thanks @dadoonet, that is the kind of thing I am looking for. Reading the AWS SDK, I saw that I can run a Scan operation over the whole table, which is similar to the SQL statement "SELECT * FROM table".
But I don't know whether I can do this in Logstash, or how to.
I use this kind of query in Logstash with MySQL through the JDBC connector and it works flawlessly: it sent the tables to Elasticsearch and kept them in sync with every change. I was wondering if anyone has a more tutorial-style way of doing it (I don't have much experience dealing with Elasticsearch and NoSQL data).

Why not use a Lambda similar to the one you previously wrote?