Hi there, I'm currently using ELK to index and visualise my data. At the moment, the data I'm working with sits in a CSV file. Once the data below is in Elasticsearch I can visualise it in a metric graph in Kibana, and 12 records are shown, which is fine. When I then change the CSV by adding one row, my metric in Kibana jumps to 38. Why is this? Basically, what I need to achieve is for the current data to be reindexed, essentially replacing what is already there. I appreciate I have probably missed something blatantly obvious, but any help would be appreciated.
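I assume a brute-force workaround would be to delete the index before each run and let Logstash recreate it, something like the request below (cars being the index name from my config), but ideally I'd like the pipeline itself to replace the old data rather than me wiping it by hand each time:

curl -X DELETE "http://localhost:9200/cars"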
My CSV is as follows:
ID | car | color | date_cars |
---|---|---|---|
1 | ford | blue | 05/05/2018 |
2 | ford | blue | 06/05/2018 |
3 | honda | silver | 07/05/2018 |
4 | kier | silver | 08/05/2018 |
5 | kier | red | 09/05/2018 |
6 | bmw | red | 10/05/2018 |
7 | bmw | blue | 11/05/2018 |
8 | ford | white | 12/05/2018 |
9 | ford | red | 13/05/2018 |
10 | ford | green | 14/05/2018 |
11 | ford | silver | 15/05/2018 |
12 | ford | silver | 16/05/2018 |
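(The table above is just how I'm presenting the data here; the actual file is plain comma-separated rows, matching the "," separator in my config. Assuming no header row in the file itself, the first few lines look like this:)

1,ford,blue,05/05/2018
2,ford,blue,06/05/2018
3,honda,silver,07/05/2018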
My Logstash config file is as follows:
input {
  file {
    path => "C:\Data\cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["ID", "car", "colour", "date_cars"]
  }
  date {
    match => ["date_cars", "yyyy-MM-dd HH:mm:ss Z", "ISO8601"]
    target => "cars_date"
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "cars"
  }
  stdout {}
}
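From reading the Logstash docs I wondered whether setting document_id in the elasticsearch output, based on my ID column, would make re-runs overwrite the existing documents instead of adding new ones. This is just a sketch of what I mean, not something I've tested:

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "cars"
    # untested guess: use the CSV ID as the Elasticsearch _id so a
    # re-read row updates the existing document instead of creating a new one
    document_id => "%{ID}"
  }
  stdout {}
}

Is that the right direction, or is there a better way to fully reindex/replace the data each time the CSV changes?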