How to make Filebeat ship logs it missed during a period?

Hi! I have an index that receives data from Logstash (which in turn gets its data from Filebeat). Some logs were missed because Filebeat was installed on the node only later, so the dashboards show nothing for that period of time (before the Filebeat installation). How can I make Filebeat ship those logs to Logstash without losing the current data and dashboards?

Run a separate Filebeat instance pointed at the old log files to ingest them.
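For reference, a minimal config for such a second instance could look like the sketch below; the log path and Logstash host are placeholders, not values from this thread, so adjust them to your setup.

```yaml
# filebeat-old-logs.yml -- config for a separate, one-off Filebeat instance.
# The path and host are examples; substitute your own.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/old/*.log   # files written before Filebeat was installed

output.logstash:
  hosts: ["localhost:5044"]        # same Logstash endpoint the main instance ships to
```

As long as your Logstash pipeline parses the event timestamp out of each log line (rather than stamping events with ingestion time), the backfilled events will land in the correct time range on your existing dashboards.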


Thanks, but the missed logs sit in the middle of the old log files, and I don't know whether it's okay to run multiple instances of Filebeat on the same node. Is it possible to remove that extra Filebeat instance later?

What do you mean by "middle of the old logs"? If the logs are all in the same directory, or even in different directories, you can include them in the same Filebeat config; a single instance can read multiple files at once. And yes, you can run multiple Filebeat instances on the same system.

I'd copy the old logs to a different directory and run Filebeat manually on that directory with `close_eof: true` on the input, so that Filebeat closes each file once it has read all of the old, static content. We're doing exactly this for historical AWS logs while another Filebeat instance ingests the live/current events from SQS.
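Here's a sketch of that workflow; the directory names and Logstash host are illustrative, assuming the old files were copied to /tmp/old-logs/.

```yaml
# filebeat-backfill.yml -- one-shot ingestion of static, historical log files.
filebeat.inputs:
  - type: log
    paths:
      - /tmp/old-logs/*.log   # copies of the old files to backfill
    close_eof: true           # close each harvester once it reaches end-of-file

output.logstash:
  hosts: ["localhost:5044"]
```

You can then run it with something like `filebeat --once -c filebeat-backfill.yml --path.data /tmp/filebeat-backfill`. The `--once` flag makes Filebeat exit after all harvesters have closed (which `close_eof` guarantees for static files), and a separate `--path.data` keeps this instance's registry from colliding with the main instance's. Once it finishes, just delete the config file and the temporary data directory; there's nothing else to remove.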

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.