SQL output to Log File


I have a pgsql script that writes the value of a `select count(*)` query to a log file.
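For context, the logging step might look something like the sketch below. The database, table, and path names are placeholder assumptions, not taken from my actual script, and the real query is shown only in the comment (simulated here with a fixed value):

```shell
# Hypothetical sketch of the scheduled logging step; db/table/path are assumptions.
# The real query would run via psql, e.g.:
#   psql -t -A -d mydb -c "SELECT count(*) FROM orders" >> /tmp/rowcount/output.log
# -t suppresses headers and -A drops alignment padding, so only the bare
# number is appended to the log. Simulated here with a fixed value:
LOGFILE=/tmp/rowcount/output.log
mkdir -p "$(dirname "$LOGFILE")"
echo "42" >> "$LOGFILE"
```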

I have configured Filebeat to pick up this log and push the data into Elasticsearch, with an ingest pipeline that converts the message field to an integer value so I can run metrics against it. This runs every 10 minutes.
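A minimal sketch of the Filebeat side, assuming hypothetical paths and a hypothetical pipeline name (neither is from my actual config):

```yaml
# filebeat.yml (sketch; paths and pipeline name are assumptions)
filebeat.inputs:
  - type: log
    paths:
      - /tmp/rowcount/*.log
    # route events through an Elasticsearch ingest pipeline
    pipeline: rowcount-pipeline

output.elasticsearch:
  hosts: ["localhost:9200"]
```

The ingest pipeline itself would use the `convert` processor to turn the `message` field into an integer before indexing.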

My issue comes when the file contains the same value consecutively: Filebeat treats it as the last entry it already read and does not push the result to Elasticsearch.

I then thought I would be clever: when generating the output, delete the original output file and create a new file with a date/time stamp in the name, and set Filebeat to pick up all `/*.log` files, so that every time the old file is removed a new log file is added with a fresh timestamp in the name. Somehow Filebeat saw through my trickery and only pushed up the first timestamped file.

How can I get Filebeat to push this file every time, regardless of whether it matches the last output or not?


I managed to resolve this by setting the output log file name to be just the time/date stamp, rather than output-(time/date).log.
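The fix can be sketched like this; the directory and the count value are placeholder assumptions:

```shell
# Sketch of the fixed output step: each run removes the previous file and
# writes a brand-new file whose name is just the date/time stamp, so
# Filebeat never mistakes it for a file it has already read.
# Directory and count value are placeholder assumptions.
LOGDIR=/tmp/rowcounts
mkdir -p "$LOGDIR"
rm -f "$LOGDIR"/*.log                  # drop the previous run's file
STAMP=$(date +%Y-%m-%d_%H%M%S)
echo "42" > "$LOGDIR/$STAMP.log"       # e.g. 2024-01-01_101500.log
```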

I also reviewed the registry file to make sure it wasn't accumulating stale entries as a result.
