Occasionally I get a log message that Logstash can't parse. Usually it's because a field name is invalid; sometimes it's because my Elasticsearch mappings are incorrect.
I want to collect these logs so I can act on them, for example by renaming the offending field.
I've tried using Filebeat pointed at /var/log/logstash_server.log. That works, but when an event with a bad mapping can't be indexed into Elasticsearch, the failure is logged back into that same file, creating a feedback loop.
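For reference, this is roughly what my Filebeat setup looks like (Filebeat 7.x syntax; the Elasticsearch host is a placeholder):

```yaml
# filebeat.yml (sketch) -- ships Logstash's own log file to Elasticsearch
filebeat.inputs:
  - type: log
    paths:
      - /var/log/logstash_server.log

output.elasticsearch:
  hosts: ["localhost:9200"]   # placeholder endpoint
```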
How can I get Logstash's own logs into Elasticsearch without risking this kind of loop?