I also want to collect a specific configuration file with Filebeat, structured as key=value lines, so that I can use it in a Kibana dashboard. It works the first time the file is read, but if the file is changed by modifying a value or adding a line at the beginning or middle of the file, it is processed incorrectly. Is there any way to make Filebeat process the whole file (and only this file) from the beginning every time it changes, or to process only the modified lines?
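For context, the input definition looks roughly like this (the path is a placeholder for my actual file, and the dissect processor is just one possible way to split the key=value lines):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /etc/myapp/settings.conf   # placeholder path to the key=value file
    processors:
      # Split each "key=value" line into separate fields for use in Kibana
      - dissect:
          tokenizer: "%{key}=%{value}"
          field: "message"
          target_prefix: "config"
```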
Filebeat is built on the assumption that data is always appended at the end of the file. The way it keeps state is by storing how many bytes have already been read from the file; every time there is new data, Filebeat starts reading from the last known offset.
Also, there is no option for Filebeat to edit an event that has already been sent to Elasticsearch.
If you really want to re-read those config files, you could try to play with the clean_inactive setting. This makes Filebeat remove the state of the file and re-read the whole file once it finds it again. It also means all lines would be sent to Elasticsearch again.
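A minimal sketch of what that could look like, with placeholder path and timeouts (note that clean_inactive must be greater than ignore_older plus scan_frequency):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /etc/myapp/settings.conf   # placeholder path
    # Close the harvester shortly after the file stops changing
    close_inactive: 30s
    # Stop picking the file up again once it has been unmodified for this long
    ignore_older: 1m
    # Drop the file's state from the registry after it has been inactive
    # for this long; must be greater than ignore_older + scan_frequency
    clean_inactive: 2m
```

Once the state has been dropped, the next modification makes Filebeat treat the file as new and read it from the first line, which is exactly the "all lines sent again" behaviour described above.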
Be aware that this is a big hack, and even if it works, the results can be flaky.