Logstash - v:2.2.2 : input file path not working

I have the following stack installed on my Windows 8:
elasticsearch 2.20
logstash 2.2.2
kibana 4.4.1
I have a csv file from which I want to load the data into elasticsearch.

My configuration file is:

input {
  file {
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Product Code","Make","Description","Availability","Price"]
  }
  mutate { convert => ["Availability", "integer"] }
  mutate { convert => ["Price", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "5135-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout {}
}

If I run "logstash -f path-to-the-config-file --configtest", it prints "Configuration OK".
When I try to load the data (with: logstash -f path-to-the-config-file), it doesn't report any configuration errors, but it doesn't load the data from the CSV file into Elasticsearch either. It just remains stuck; all it prints is:

Settings: default pipeline workers: 8
Logstash startup completed

I've read that this can be a problem with the syntax of the file path in the input: apparently Logstash works with Unix-like paths, so I changed the path to use "/" instead of "\" ("C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"); it still doesn't load the data.

Any advice on how to handle this situation?

Most likely Logstash is waiting for more data to be appended to the file. Read the file input's documentation about sincedb files to better understand what's going on.
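As a side note: if the goal is simply to re-read the file from the beginning on every run, a common workaround on Windows is to point sincedb_path at the NUL device so Logstash never persists a read position (a sketch reusing the input block from the question):

input {
  file {
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
    type => "core2"
    start_position => "beginning"
    # NUL is the Windows equivalent of /dev/null: no sincedb is persisted,
    # so the file is read from the start on every Logstash restart
    sincedb_path => "NUL"
  }
}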

I've modified it: I added a file, sincedb.out, and referenced it in the configuration file:

input {
  file {
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
    type => "core2"
    start_position => "beginning"
    sincedb_path => "D:/ELK/logstash-2.2.2/sincedb/sincedb.out"
  }
}

Unfortunately, the data is still not being loaded, just like before adding sincedb_path to the config file. Am I not setting sincedb_path properly?

Looks okay. Increase the log level by passing --verbose when starting Logstash and read the log for clues.
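For reference, assuming the config file path used earlier in the thread, that would be something like:

logstash -f path-to-the-config-file --verbose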

I've started Logstash with the --verbose option, and everything looked fine.
I then modified the content of the CSV file, and after that the data was loaded, so sincedb_path doesn't work the way I expected. I actually receive a new CSV file with the data each day at a given hour, so as a quick fix I could configure the name of the CSV file to be dd-mm-YYYY-5135.csv instead of 5135.csv (as it is currently); Logstash would then see it as a new file each day and load the data without any inconvenience.
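If the daily files do get dated names, the file input's path setting also accepts glob patterns, so a sketch along these lines (the directory and naming scheme are assumptions based on the thread) would pick up each new day's file automatically:

input {
  file {
    # the glob matches every dated CSV dropped into the folder,
    # e.g. 01-03-2016-5135.csv
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/*-5135.csv"
    type => "core2"
    start_position => "beginning"
  }
}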