File plugin not reading files as expected

When 'sincedb_path' is set to '/dev/null' and 'start_position' to 'beginning', Logstash does not parse the files as expected; it always starts and then hangs at the output below:

```
Settings: Default pipeline workers: 4
Pipeline main started
```

In my case, I have a lot of existing logs, but I do not want Logstash to keep monitoring them; I just want the logs parsed once, sent to Elasticsearch, and be done with them. In addition, the logs are stored on an external hard drive formatted with NTFS, mounted on Ubuntu.

Here's my log.conf for testing against a single log file:

```
input {
  file {
    type => "apacheSubdomainlog"
    path => "/home/ubuntuser/logs/asubdomain_access_20160219.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```

Why not cat them into a stdin input then?
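A one-shot import along those lines might look like this sketch (the config filename and the filter/output sections are placeholders, not from the original post):

```
# stdin.conf -- hypothetical one-shot config; add your filter/output sections
input {
  stdin {
    type => "apacheSubdomainlog"
  }
}

# then feed the file in from the shell:
#   cat /home/ubuntuser/logs/asubdomain_access_20160219.log | bin/logstash -f stdin.conf
```

One trade-off to note: with a stdin input there is no path field on the events, so any information encoded in the filename would have to be supplied some other way.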

In a typical Apache access_log, an entry may look like this:

```
123.123.123.123 - www.placeholder.com - [24/Jan/2016:00:00:00 +1100] "GET /favicon.ico HTTP/1.1" 404 213 "-" "-" "-"
```

but in my case, all logs have a name like SubdomainName_access_20160218.log, and the entries look like:

```
121.121.121.121 - - [19/Feb/2016:23:58:12 +1100] "GET /favicon.ico HTTP/1.1" 404 297
```

the "www.placeholder.com" is missing. therefore In order to get the domain name this log is relevant to, I need to use the file plugin and mutate the path of the file into domain name field in my index,

If the files are older than a day, you need to adjust the file input's ignore_older option.
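For example (the one-year value is only an illustration; pick anything larger than the age of your oldest log; at the time, the file input's default was one day, i.e. 86400 seconds):

```
input {
  file {
    path => "/home/ubuntuser/logs/asubdomain_access_20160219.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => 31536000  # seconds; files modified longer ago than this are skipped
  }
}
```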