Logstash not parsing files that aren't brand new


My Logstash-to-Elasticsearch workflow works, but only if I open each log file in a text editor, delete the original, and save it back to the directory my Logstash .conf file points at. Without doing this, Logstash won't parse the files, even though they are brand new files that have never been parsed by Logstash or sent to Elasticsearch before.

Here is what my logstash.conf file looks like:

input {
    file {
        path => "/path/to/data/*"
        type => "data"
        start_position => "beginning"
    }
}

filter {
    if [type] == "data" {
        csv {
            columns => ["Header","Header2","Header3"]
            separator => ","
        }
    }
}

output {
    if [type] == "data" {
        elasticsearch {
            action => "index"
            index => "prefetch"
            hosts => ""
            workers => 1
            user => "user"
            password => "pass"
        }
    }
    stdout {
        codec => rubydebug
    }
}

I've got about 10 GB of data in the path, but upon running Logstash with ./bin/logstash -f logstash.conf I just get the following message in my terminal:

Settings: Default pipeline workers: 16
Pipeline main started

Then Logstash just sits there and nothing happens. If I take one of the files from the /path/to/data directory, open it in Sublime Text, delete it from the original path, and then save it back to the original path, Logstash starts parsing it.

I've already deleted all the .sincedb* files from my $HOME directory, which didn't solve anything, and I've also tried setting sincedb_path => "/dev/null" in the config file, which didn't work either. How can I get Logstash to parse all of these files?

If you start Logstash with the --verbose option, the file input will tell you more about what it's doing. My guess is that you need to adjust the file input's ignore_older option so that it doesn't skip files older than 24 hours.
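For example, raising ignore_older in the file input might look like the sketch below (the value is in seconds, and 2592000 here is an illustrative 30-day window, not a required value; in Logstash versions of this era the default was 86400, i.e. 24 hours):

input {
    file {
        path => "/path/to/data/*"
        type => "data"
        start_position => "beginning"
        # Only skip files whose mtime is older than 30 days (default was 24 hours)
        ignore_older => 2592000
    }
}

Editing and re-saving a file updates its modification time, which is why your workaround made Logstash pick the files up.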

Ah man, that worked. Awesome, thank you so much!