I have installed Logstash 7.15.0 and I have the following config file:
input {
  file {
    path => "C:/Users/ELK Stack/data/sample.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
  }
  mutate {
    remove_field => [ "message", "host", "path", "@version", "@timestamp" ]
  }
  mutate {
    convert => {
      "price" => "integer"
    }
  }
  date {
    match => ["expiryDate", "yyyyMMdd"]
    target => "expiryDate"
    timezone => "UTC"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "vehicle"
    document_id => "%{licensePlate}"
  }
}
But when I run it with Logstash, the index isn't populated correctly: the same document keeps being overwritten with different values, and the field mappings come out wrong. For example, with the following input file:
licensePlate,color,price,expiryDate
AB1234,blue,300000,20241115
one of the resulting documents looked like:
"AB1234": 300000
etc.
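For comparison, given the config above (CSV columns as field names, `price` converted to an integer, and `expiryDate` parsed by the date filter into a UTC timestamp), I would expect the document for that row to look roughly like this (a sketch of the intended output, not what I actually get):

```json
{
  "licensePlate": "AB1234",
  "color": "blue",
  "price": 300000,
  "expiryDate": "2024-11-15T00:00:00.000Z"
}
```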
Can someone tell me whether my config is wrong, or whether there is some other setting I need to fix? I have run this exact file before and it worked, but that was on a different laptop, so I'm not sure if I had a different configuration there.