Shipping logs from multiple log files to Logstash from a single Filebeat instance

Hello

I have a setup where I ship only one log file, /abc/example.log, to a Logstash instance, configured like below:

filebeat.prospectors:
- paths:
    - /abc/example.log
  fields:
    app_suite: cba
    application: abc

name: "xyz"

output.logstash:
  template.name: "filebeat"
  template.path: "filebeat.template.json"
  hosts: ["ip:5044"]

======================

In the same directory abc, I also have logs rolled from example.log and compressed into zip files to manage storage. So the zipped files contain old logs, while fresh logs go to the example.log file.

I would like to know:
Can I also ship these zipped files to Logstash along with the example.log file? I want the old data to be visible in ELK as well, and if it is possible, how do I do it? Or should I retain all the logs in the example.log file, with no rolling of logs into zip files?

Thanks

I'd probably just concatenate the old files into one and read it in as a one-off.
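That one-off concatenation could be scripted along these lines (a sketch; the file names and the glob pattern /abc/example.log.*.zip are assumptions about how your rolled logs are named):

```python
import glob
import zipfile

def concat_zipped_logs(zip_glob, out_path):
    """Extract every file inside each matching zip archive and
    append its contents, in sorted archive order, to one output file."""
    with open(out_path, "wb") as out:
        for zpath in sorted(glob.glob(zip_glob)):
            with zipfile.ZipFile(zpath) as zf:
                for name in zf.namelist():
                    out.write(zf.read(name))

# Hypothetical usage -- adjust paths to your layout:
# concat_zipped_logs("/abc/example.log.*.zip", "/abc/old/combined.log")
```

Point a prospector at the combined file once, let Filebeat ship it, and then remove it.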

Thanks for the response. Do you mean that concatenating the files is the only option here?

I'm open to other options as well.

Certainly not. I’m saying that would be the fastest way for me.

Also just curious to know what would be the other options

Why not create another directory, extract all the old logs there, and then read that directory in?
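Concretely, that could look like adding a second prospector for the extracted files (a sketch; the directory name /abc/old_logs and the *.log pattern are assumptions, and the zips need to be extracted there first since Filebeat reads plain text, not compressed archives):

```yaml
filebeat.prospectors:
- paths:
    - /abc/example.log
  fields:
    app_suite: cba
    application: abc
- paths:
    - /abc/old_logs/*.log
  fields:
    app_suite: cba
    application: abc
```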

Sorry for the delayed response. We implemented that option.

Thanks!