I have the following stack installed on my Windows 8 machine:
elasticsearch 2.2.0
logstash 2.2.2
kibana 4.4.1
I have a csv file from which I want to load the data into elasticsearch.
My configuration file is:
input {
  file {
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Product Code", "Make", "Description", "Availability", "Price"]
  }
  mutate { convert => ["Availability", "integer"] }
  mutate { convert => ["Price", "float"] }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "5135-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout {}
}
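One thing I'm unsure about is the file input's sincedb: from what I've read, Logstash remembers how far it has read each file, so start_position => "beginning" only applies to files it hasn't seen before. A variant of my input that should discard that state (the "NUL" value is my guess at the Windows equivalent of /dev/null) would be:

input {
  file {
    path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
    type => "core2"
    start_position => "beginning"
    # assumption: NUL is the Windows null device, so no read position is persisted
    sincedb_path => "NUL"
  }
}

Is that the right way to force a full re-read on Windows?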
If I run "logstash -f path-to-the-config-file --configtest", it prints "Configuration OK".
When I try to actually load the data (with: logstash -f path-to-the-config-file), it doesn't report any configuration errors, but it doesn't seem to load the data from the csv file into elasticsearch either. It just hangs; all it prints is:
Settings: default pipeline workers: 8
Logstash startup completed
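In case more output would help diagnose this, I believe I can rerun with debug logging enabled (I'm assuming the --debug flag behaves the same on Windows):

logstash -f path-to-the-config-file --debug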
I've read that this can be a problem with the syntax of the file path in the input: apparently logstash expects a unix-like path, so I even changed the path to use "/" instead of "\" or "\\", i.e. "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"; it still doesn't load the data.
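For completeness, these are the two path styles I tried (the doubled backslashes in the second form are my attempt at escaping, since I understand a bare backslash can be treated as an escape character in the config syntax):

path => "C:/Users/andreealibotean/Desktop/ELK-datafeed-files/5135.csv"
path => "C:\\Users\\andreealibotean\\Desktop\\ELK-datafeed-files\\5135.csv"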
Any advice on how to handle this situation?