File plugin doesn't read file

Hello,

I just upgraded Logstash to version 2.3.2. My config file worked fine on the previous version (2.3.1, I think), but since the upgrade it seems like the log files I specified for the file plugin aren't being read. It works just fine with the stdin plugin, though.

I have the following configuration for the file plugin:
file {
	path => ["/home/osadmin/logs/com/*.log", "/home/osadmin/logs/ofr/*.log"]
	sincedb_path => "/home/osadmin/logstash/sincebd_com_ofr"
	start_position => "beginning"
}

As output, I have the elasticsearch plugin and stdout, but nothing arrives at my Elasticsearch node and nothing is printed to standard output, just the startup messages:

Settings: Default pipeline workers: 2
Pipeline main started

Does anyone know why it isn't working?

Thanks for your help,
Noémie

Is it a sincedb issue?
I guess that is hard to test if you have lots of old files sitting there, though.

Hi,

There is actually no sincedb file created. It looks like the file plugin couldn't find the files I specified in "path". I checked and double-checked, and the files are exactly where they are supposed to be, at the exact path specified in the file plugin.

Check the Logstash logs for clues. You'll probably want to start Logstash with --verbose.

After some experimenting, it seems that Logstash doesn't read the files unless I run touch on the files I want read. My configuration is the following:

input {
	file {
		path => ["path_to_logs/*.log"]
		sincedb_path => "path_to_sincedb_file/sincedb"
		start_position => "beginning"
	}
}

I tried removing the sincedb file, but even then I have to run the touch command on the files before they are taken into consideration for processing.

Also, when I add new files to the watched folder, Logstash doesn't seem to see them and doesn't process them.

Do you have any idea what it could be?

Thanks for your help,
Noémie

I experienced a similar issue and looked into the internals of the file input plugin and the filewatch gem.

I saw that Logstash uses a combination of the file input's configuration options to determine whether it should pick up a file.

The filewatch gem takes the close_older and ignore_older parameters and compares them against each file's mtime and atime (from stat).
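The comparison can be sketched roughly like this (a simplified illustration, not the actual filewatch code; the constant and helper name are mine):

```ruby
require 'tempfile'

# Logstash 2.3's default ignore_older: one day, in seconds.
IGNORE_OLDER = 86_400

# A file is skipped when its modification time is older than the
# ignore_older threshold (simplified sketch of the filewatch check).
def ignorable?(path, threshold = IGNORE_OLDER)
  (Time.now - File.mtime(path)) > threshold
end

file = Tempfile.new('demo')
puts ignorable?(file.path) # freshly created file => false
```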

A link with more details on the default parameters: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#plugins-inputs-file-close_older

That explains why the touch command causes the files to be picked up: touch refreshes the access and modification times, so the file no longer counts as older than the ignore_older setting.
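Here's a small demonstration of that mechanism (an illustration of the mtime effect only, outside Logstash; the backdating via File.utime simulates an old log file):

```ruby
require 'fileutils'
require 'tempfile'

# Logstash 2.3's default ignore_older, in seconds (one day).
IGNORE_OLDER = 86_400

path = Tempfile.new('old').path
two_days_ago = Time.now - (2 * 86_400)
File.utime(two_days_ago, two_days_ago, path) # backdate atime and mtime

puts (Time.now - File.mtime(path)) > IGNORE_OLDER # true: would be skipped

FileUtils.touch(path) # the workaround from this thread: refreshes mtime

puts (Time.now - File.mtime(path)) > IGNORE_OLDER # false: picked up again
```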

However, I ran control tests from which I saw unexpected behavior.

When I point my file path at a directory containing an old log file, the file is skipped as expected, with the corresponding debug entry:

'skipping because it was last modified more than 86400.0 seconds ago'

The last entry in my debug log shows that the pipeline has started.

When I copy a newer file into the same directory, that file is not discovered and picked up.

When I restart Logstash, it consumes the new file and skips the old file.

Any help and/or explanation would be appreciated.

input {
	file {
		path => ["/Input-Files-2/*"]
		codec => multiline {
			pattern => "^%{NUMBER:line_num}"
			negate => true
			what => "previous"
		}
		discover_interval => 420
		start_position => "beginning"
		sincedb_path => "logs/sincedb"
		stat_interval => 10
	}
}

Hello,

Thank you for your answer. Changing the ignore_older setting did the trick.
It also resolved the problem where new files copied into the directory were only picked up after restarting Logstash.
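For reference, the working configuration would look something like this (a sketch using the placeholder paths from earlier in the thread; the 30-day ignore_older value is just an example, tune it to how old your logs can get):

```
input {
	file {
		path => ["path_to_logs/*.log"]
		sincedb_path => "path_to_sincedb_file/sincedb"
		start_position => "beginning"
		ignore_older => 2592000   # 30 days, in seconds; the 2.3 default is 86400 (one day)
	}
}
```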