I just upgraded Logstash to version 2.3.2. My config file worked fine on the previous version I had (2.3.1, I think), but since I upgraded, it seems like the log files I specified for the file plugin aren't read. It works just fine with the stdin plugin, though.
I have the following configuration for the file plugin:
file {
path => ["/home/osadmin/logs/com/*.log" , "/home/osadmin/logs/ofr/*.log"]
sincedb_path => "/home/osadmin/logstash/sincebd_com_ofr"
start_position => "beginning"
}
As outputs, I have the elasticsearch plugin and stdout, but nothing reaches my Elasticsearch node and nothing is printed to standard output, just the startup messages:
Settings: Default pipeline workers: 2
Pipeline main started
There is actually no sincedb file created. It's as if the file plugin couldn't find the files I specified in "path". I checked and double-checked, and the files are exactly where they are supposed to be, at the exact same path as specified in the file plugin.
After some experimenting, it seems like Logstash doesn't read the files unless I run touch on the files I want read. My configuration is the one shown above.
I tried removing the sincedb file, but even then I have to run the touch command on the files I want processed before they are taken into consideration.
Also, when I add files to the folder containing the files to read, Logstash doesn't seem to see them and doesn't process them.
That explains why the touch command causes the files to be picked up: touching a file updates its access and modification times, so a file whose timestamps were older than the ignore_older setting falls back within the window and is no longer skipped.
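If the files legitimately have old timestamps, the window can be widened instead of touching them. A minimal sketch, assuming the Logstash 2.3 file input (the value is in seconds, and the path below is illustrative):

```
file {
  path => ["/home/osadmin/logs/com/*.log"]
  sincedb_path => "/home/osadmin/logstash/sincebd_com_ofr"
  start_position => "beginning"
  ignore_older => 604800   # one week; the default of 86400 (one day) skips older files
}
```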
However, I ran control tests from which I saw unexpected behavior.
When I point my file path at a directory with an old log file, the file is skipped as expected, with the corresponding debug entry:
'skipping because it was last modified more than 86400.0 seconds ago'
The last entry in my debug file shows that the pipeline has started
When I copy a newer file to the same directory, that file is not discovered and picked up.
When I restart Logstash, it consumes the new file and skips the old file.
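The behavior above is consistent with mtime-based filtering. A quick way to check from the shell, assuming GNU coreutils (`old.log` is a hypothetical file):

```shell
# create a file whose mtime is older than the default ignore_older window (86400 s)
touch -d "2 days ago" old.log
date -r old.log          # shows a modification time two days in the past

# touching the file refreshes its mtime, so the file input would pick it up again
touch old.log
date -r old.log          # now shows the current time
```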
Thank you for your answer. Changing the ignore_older setting did the trick.
Changing that setting also resolved the problem of having to restart Logstash after copying a new file into the directory.