[s3 Logstash] Setting bucket prefix to be specific range of days

We're pushing logs into an s3 bucket. We've set up our indexer configuration and tested it with a single folder within our bucket, whose naming convention is the date (e.g. my-bucket-of-logs/20200804). However, we want to be able to pull in logs from the last 14 days (14 folders):

  • /20200804, /20200803, /20200802, .......
```
input {
	s3 {
		bucket => "my-bucket-of-logs"
		access_key_id => "{{ access_key_id }}"
		secret_access_key => "{{ secret_key_id }}"
		prefix => "/20200804/"
	}
}
```
From the docs => https://www.elastic.co/guide/en/logstash/current/plugins-inputs-s3.html#plugins-inputs-s3-prefix prefix is only allowed to be a string, so I'm not sure if there's another way around this issue?

Any help or insight would be appreciated.

You could use 14 inputs in 14 configuration files. Once a day, replace the oldest configuration file with one for today. If logstash is running with -r it will reload the pipeline when it sees one of the configuration files change. It's a hack, but it might do what you want to get done.
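Roughly, something like this daily job (a sketch only: `CONF_DIR` and the input template are assumptions; point `CONF_DIR` at whatever directory `logstash -r` is loading, and omit the credentials or template them in as in the original config):

```shell
#!/bin/sh
# Sketch of the 14-rotating-configs hack: one s3 input config per day,
# regenerated daily so the oldest day ages out of the window.
CONF_DIR="./conf.d"   # assumption: the directory logstash -r watches
mkdir -p "$CONF_DIR"

# (re)write one input config per day for the last 14 days
for i in $(seq 0 13); do
  day=$(date -d "-$i days" +%Y%m%d)   # GNU date; BSD/macOS needs date -v-"$i"d
  cat > "$CONF_DIR/s3-$day.conf" <<EOF
input {
  s3 {
    bucket => "my-bucket-of-logs"
    prefix => "$day/"
  }
}
EOF
done

# drop any configs that have aged out of the 14-day window
find "$CONF_DIR" -name 's3-*.conf' | sort | head -n -14 | xargs -r rm
```

Run it once a day from cron; the regeneration is idempotent, so rerunning it is harmless.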

Thanks for the reply @Badger. Unfortunately that's not a viable solution. I think the best solution in this case would be to create another s3 bucket, ingest all logs there, and copy the log files into the existing bucket's daily directories:

  • push cloudflare logs to "new-logs-bucket"
  • use s3-indexer.conf to point to that bucket
  • copy over log files into my-bucket-of-logs/{{ current_date }}
  • create lifecycle policy to remove logs after N days from "new-logs-bucket"
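The steps above could be sketched like this (bucket names are from the thread; the lifecycle JSON, the 14-day window, and the cron schedule are my assumptions, untested):

```shell
#!/bin/sh
# Sketch of the daily copy job for the two-bucket plan. Run once a day from cron.
TODAY=$(date +%Y%m%d)

# lifecycle policy: expire objects in the ingest bucket after 14 days
# (the exact JSON is an assumption; apply it once, not daily)
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-after-14-days",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 14 }
    }
  ]
}
EOF

# the AWS calls need credentials, so guard on the CLI being present
if command -v aws >/dev/null 2>&1; then
  aws s3api put-bucket-lifecycle-configuration \
      --bucket new-logs-bucket \
      --lifecycle-configuration file://lifecycle.json
  # sync new objects from the ingest bucket into the dated directory
  # the existing s3-indexer.conf prefix already points at
  aws s3 sync "s3://new-logs-bucket/" "s3://my-bucket-of-logs/$TODAY/"
fi
```

Note `aws s3 sync` copies everything not yet present in the destination, so with a daily run that approximates "today's logs"; if the ingest bucket keeps 14 days of objects you may want to sync from a dated prefix instead.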

I am open to other suggestions as well.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.