Hi, I am working on a Logstash pipeline that reads logs from AWS MSK (Kafka) and sends them to S3. I use the S3 output plugin, which writes objects in its default format as .txt files. My requirement is to send logs based on the config below in the output section.
time_file => 15
size_file => 488
I need a solution where the logs are written to S3 in .gz format while still rotating on time_file and size_file. Any pointers on whether this is doable in Logstash?
For reference, my output section:
output {
  s3 {
    bucket     => "logstash-poc"
    region     => "us-east-1"
    prefix     => "custom-logs/%{account_id}/%{log_group_flat}/"
    codec      => json_lines
    time_file  => 15
    size_file  => 488
    canned_acl => "private"
  }
  stdout {
    codec => rubydebug
  }
}
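From reading the logstash-output-s3 plugin docs, I believe there is an encoding option and a rotation_strategy option, so what I was planning to try is the sketch below. I have not tested it yet, so please treat encoding => "gzip" and rotation_strategy => "size_and_time" as my assumptions about the plugin rather than confirmed behaviour:

output {
  s3 {
    bucket     => "logstash-poc"
    region     => "us-east-1"
    prefix     => "custom-logs/%{account_id}/%{log_group_flat}/"
    codec      => json_lines
    # assumption: gzip-compress each part file before it is uploaded to S3
    encoding   => "gzip"
    # assumption: rotate on whichever of time_file / size_file is reached first
    rotation_strategy => "size_and_time"
    time_file  => 15
    size_file  => 488
    canned_acl => "private"
  }
}

Is this the right direction, or is there a better way to get .gz output with the same time/size based rotation?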