Files are not processed with file input unless I edit them

With the file input, I understand that sincedb tracks whether a file has already been processed. So if I have a new file that Logstash has never processed (i.e. it is not in sincedb), it should be read from the beginning.
My input config is like this:

input {
  file {
    start_position => "beginning"
    path => "/some/folder/*.log"
    add_field => {
      "client" => "myclient"
      "product" => "myproduct"
    }
  }
}

If I move a file into the folder Logstash is watching (/some/folder/*.log), nothing happens. Only after I edit the file and change something inside it does it get processed, just as if it had already been in sincedb.

How do I know it is not in sincedb? Because I run:

$ ls -i file.log
96993940 /some/folder/file.log

Then:

$ grep 96993940 /var/lib/logstash/.sincedb*

and nothing...
Any clue why this happens? I'm editing log files that are 3 GB each... so it's painfully slow.
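In case it helps anyone reproduce the check, here's a little helper I'd sketch for it (my own script, not part of Logstash; it assumes each sincedb line starts with the inode number, which is how the entries looked on my system):

```shell
# Sketch of a helper that reports whether a file's inode appears in a
# sincedb file. Column 1 of each sincedb line is assumed to be the inode.
in_sincedb() {
  inode=$(ls -i "$1" | awk '{print $1}')
  awk -v i="$inode" '$1 == i { found = 1 } END { exit !found }' "$2" \
    && echo "tracked" || echo "not tracked"
}
```

Usage would be something like `in_sincedb /some/folder/file.log /var/lib/logstash/.sincedb_<hash>` (the exact sincedb filename varies per install).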

Start Logstash with --verbose and check your logs for clues.

I'll do it once it finishes the files currently being processed. In the meantime, what I did is a super not-fancy workaround: for every file I ran something like this:

perl -p -i -e 'BEGIN { print "\n" }' file.log

I don't like it, but it works.
Credits: http://www.cyberciti.biz/faq/bash-prepend-text-lines-to-file/
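One thing worth noting: `perl -i` rewrites the entire file to prepend that newline, which is where the slowness on 3 GB files comes from. Appending a newline instead only writes a single byte, and the size change should wake the file input just the same. A minimal sketch:

```shell
# Stand-in for the real 3 GB log file.
printf 'line1\nline2\n' > /tmp/sample.log

# Append a single empty line; the file grows by one byte instead of
# being copied wholesale the way perl -i does it.
echo >> /tmp/sample.log

wc -c < /tmp/sample.log   # 12 original bytes + 1 appended newline
```

(Like the prepend trick, this does inject one blank line into the log, so a downstream filter may want to drop empty events.)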

I just found out that old files are not parsed even if Logstash has never touched them before (old meaning more than a day old or so). I noticed it because I had 5 files Logstash had never touched (i.e. they weren't in .sincedb_*); 2 of them were new (I had created them 5 minutes before) and only those 2 were processed. The 3 old ones weren't.
It seems almost as if "something" scans the disk at some point and tells Logstash those files are old, and to only touch them if something new arrives in them.
Interesting... and frustrating...

Ah, this again. I should've caught it. Look into the file input's ignore_older option.

That was it! My files are one or two months old, so I set ignore_older => 15552000 (six months).
Thank you!
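For completeness, the input block with that fix folded in might look like this (same values as above; ignore_older is given in seconds here, and 15552000 s is roughly 180 days):

```
input {
  file {
    start_position => "beginning"
    path => "/some/folder/*.log"
    ignore_older => 15552000   # ~6 months, in seconds
    add_field => {
      "client" => "myclient"
      "product" => "myproduct"
    }
  }
}
```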