Problem with large files & output

I have a case that works fine on Logstash 1.5.x but breaks in 2.2, 2.3, and 5.0 (I haven't tried 2.0 or 2.1). I have hourly syslog files that are approximately 3 GB each, and I'm trying to backfill logs going back to January. Here's the config on my shipper:

input {
  file {
    path => "/raid0/logs/201601*.log"
    start_position => "beginning"
  }
  file {
    path => "/raid0/logs/201602*.log"
    start_position => "beginning"
  }
  file {
    path => "/raid0/logs/201603*.log"
    start_position => "beginning"
  }
  file {
    path => "/raid0/logs/201604*.log"
    start_position => "beginning"
  }
  file {
    path => "/raid0/logs/201605*.log"
    start_position => "beginning"
  }
}

output {
  rabbitmq {
    durable => "false"
    exchange => "logstash.crunch"
    exchange_type => "direct"
    host => "XXXXXXXX"
    password => "XXXXXX"
    user => "lXXXXXX"
  }
}

RabbitMQ sees zero traffic. I've also tried with stdout and file outputs and ended up with nothing; Logstash just sits there practically idle. The stdout test was as minimal as it gets, as shown below.
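For reference, the stdout test was roughly the following (the rubydebug codec is included just to make events readable; the exact codec doesn't matter for this test):

output {
  stdout {
    codec => rubydebug   # print each event as a full, pretty-printed hash
  }
}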

@Janet This is really weird; it should just be able to read everything. Any errors in the logs? You don't have any filters? If you run it with --debug do you get more details? Warning: these logs can get a bit noisy.

I simplified the config as much as I could to narrow down the issue. The logs showed nothing. I'll try --debug and let you know what happens.
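For anyone following along, on 2.x that's just a flag on the launcher (the config path below is assumed from my setup, not the exact file name):

bin/logstash -f /etc/logstash/shipper.conf --debug

On 5.0 I believe the equivalent is --log.level=debug, if I remember the flag rename correctly.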

The logstash-input-file plugin has an option called ignore_older; by default it ignores any file that was last modified more than 24 hours ago. Try setting it to something much larger and see whether that helps. From the docs:

ignore_older

Value type is number
Default value is 86400

When the file input discovers a file that was last modified before the specified timespan in seconds, the file is ignored. After its discovery, if an ignored file is modified it is no longer ignored and any new data is read. The default is 24 hours.
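So for a backfill going back to January, something like this should work (a sketch; I've collapsed your five patterns into one glob for brevity, and the one-year value is just an example, anything larger than the age of your oldest file will do):

input {
  file {
    path => "/raid0/logs/2016*.log"
    start_position => "beginning"
    ignore_older => 31536000   # one year in seconds; must exceed the age of the oldest file
  }
}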

I ran a script to touch all the files and update their timestamps, and now they're read fine. Thanks for the 86400 tip!
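For anyone else backfilling, the script was essentially just a one-liner (paths as in my config above):

find /raid0/logs -name '2016*.log' -exec touch {} +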