(Note… pruned a couple messages related to me giving up on this before as I’m getting back on this horse. My agent is now 8.18.8 and the integration is 1.43.0. Figured reusing this thread rather than starting another may be better.)
I had a couple of things going on with this initial configuration. First, I had settings scattered across too many places. You really only need four: the endpoint, the S3-compatible bucket name, the access key ID, and the secret access key. I also added the region us-east-1 at some point during troubleshooting and just left it in, though I don't know whether it actually matters.
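For anyone else chasing this, here is roughly what that boils down to in Filebeat `aws-s3` input terms. The option names below are the Filebeat input options (the integration UI labels them a bit differently), and the account ID and bucket name are placeholders, not my real values:

```yaml
filebeat.inputs:
  - type: aws-s3
    # R2 is S3-compatible but not AWS, so use the non-AWS bucket option
    non_aws_bucket_name: my-logpush-bucket            # placeholder
    endpoint: https://<account-id>.r2.cloudflarestorage.com
    access_key_id: ${R2_ACCESS_KEY_ID}
    secret_access_key: ${R2_SECRET_ACCESS_KEY}
    default_region: us-east-1   # leftover from troubleshooting; may be unnecessary
    # Do NOT set session_token for R2 — that's what produced the
    # InvalidArgument: X-Amz-Security-Token error for me.
```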
When I got down to what I thought was just those, I was still having problems, but those turned out to be caused by a session token that had been set somewhere along the line. The error included the following:
```
Input 'aws-s3' failed with: failed to create S3 API: failed to get AWS region for bucket: operation error S3: GetBucketLocation, https response error StatusCode: 400, RequestID: , HostID: , api error InvalidArgument: X-Amz-Security-Token
```
Once I cleared the session token from the config, it was able to establish a connection.
I then had some more failures where it couldn't find anything, until I removed the prefix under the Spectrum section: there are no directory prefixes in the bucket, just the time-based directories.
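In case it saves the next person some time, this is the shape of the fix as I understand it (the prefix value and object path below are illustrative, not copied from my bucket):

```yaml
    # Logpush writes objects straight into time-based directories at the
    # bucket root, e.g. 20240101/20240101T000000Z_..._<id>.log.gz (illustrative),
    # so a prefix filter matches nothing and must be left empty:
    # bucket_list_prefix: spectrum/   # <- removing this is what made listing work
```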
But now I am getting the following error in the log, numerous times on each attempt to load data, and the only thing written to the index is metadata about the attempt rather than actual records:
```
"message":"saving completed object state: can not executed store/set operation on closed store 'filebeat'","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"aws-s3-default","type":"aws-s3"},"log":{"source":"aws-s3-default"},"log.logger":"input.aws-s3.s3","log.origin":{"file.line":189,"file.name":"awss3/s3_input.go","function":"github.com/elastic/beats/v7/x-pack/filebeat/input/awss3.(*s3PollerInput).workerLoop.func2"},"service.name":"filebeat","id":"aws-s3-cloudflare_logpush.spectrum_event-39fe6d60-38d7-417a-99ec-375bc0905fb1","ecs.version":"1.6.0"}
```
Has anyone here ever seen the Cloudflare Logpush integration successfully pull from Cloudflare R2 buckets? I'm starting to feel like I'm hunting a unicorn.
If I keep spinning my wheels and don't make any more progress today, I may have to throw in the towel and have our MSP open a case with Elastic.
Thanks!