Running multiple S3 ingests

Hi all, first post here!

I have set up an ELK installation that ingests files from S3 using the Logstash S3 input plugin. There is quite a large backlog of files sitting in the bucket, so is it possible to run multiple instances of Logstash to work through it faster? To complicate matters, all of the files are in the same bucket and share the same prefix.
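For reference, here's roughly what my pipeline config looks like (bucket name, prefix, and region are placeholders rather than my real values):

```
input {
  s3 {
    bucket => "my-backlog-bucket"   # placeholder bucket name
    prefix => "logs/"               # every file shares this prefix
    region => "eu-west-1"           # placeholder region
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```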

If I fire up another instance of LS, is it likely to ingest the same files simultaneously, or is there some sort of locking mechanism that prevents that?
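For what it's worth, I assume the second instance's config would be identical apart from its state file; `sincedb_path` is my guess at the setting that would need to differ so the two instances don't clobber each other's bookkeeping:

```
input {
  s3 {
    bucket => "my-backlog-bucket"   # same bucket as the first instance
    prefix => "logs/"               # same prefix as well
    region => "eu-west-1"
    sincedb_path => "/var/lib/logstash2/sincedb_s3"   # separate state file for this instance
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```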

Thanks!