Hi all,
Whenever my Logstash starts, it loads the same files again, which causes duplicate data in Elasticsearch.
This is my config file:
input {
  file {
    path => "E:/Local_Elasticsearch/logstashv5/datafiles/*.csv"   # forward slashes, as the file input expects on Windows
    start_position => "beginning"
    sincedb_path => "/dev/null"   # disables read-position tracking
    ignore_older => 0
  }
}
filter {
  csv {
    separator => ","
    columns => ["A","B","C","D","E","F"]
  }
  mutate { convert => [ "F", "integer" ] }
  date {
    match => [ "B", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
    target => "B"
    locale => "en"
  }
  date {
    match => [ "D", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
    target => "D"
    locale => "en"
  }
  date {
    match => [ "E", "ISO8601", "yyyy/MM/dd HH:mm:ss" ]
    target => "E"
    locale => "en"
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "testindex1"
    document_type => "tesdata1"
  }
  stdout {}
}
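I suspect the sincedb_path setting is involved: pointing the sincedb at the null device disables position tracking, so every start re-reads the files from the beginning (and on Windows the null device is NUL, not /dev/null, which would not behave as expected anyway). A minimal sketch of what I am considering instead, persisting the sincedb to a real file (the sincedb path below is just a placeholder for my setup, any writable file should work):

input {
  file {
    path => "E:/Local_Elasticsearch/logstashv5/datafiles/*.csv"
    start_position => "beginning"
    # keep read positions across restarts; placeholder path
    sincedb_path => "E:/Local_Elasticsearch/logstashv5/sincedb"
  }
}

With a persisted sincedb, start_position => "beginning" should only apply to files Logstash has never seen before, not on every restart.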
And this is my CSV file:
A,B,C,D,E,F
A1,2017/09/16 00:00:00,3U0604,2017/09/16 19:22:00,2017/09/16 19:30:00,1
A2,2017/09/16 00:00:00,3U0604,2017/09/16 19:22:00,2017/09/16 19:30:00,2
A3,2017/09/16 00:00:00,9W0527,2017/09/16 11:59:00,2017/09/16 12:10:00,3
A4,2017/09/16 00:00:00,9W0531,2017/09/16 02:45:00,2017/09/16 02:35:00,4
A5,2017/09/16 00:00:00,9W0535,2017/09/16 14:56:00,2017/09/16 15:10:00,5
A6,2017/09/16 00:00:00,9W0537,2017/09/16 18:30:00,2017/09/16 18:35:00,6
A7,2017/09/16 00:00:00,9W0541,2017/09/16 12:05:00,2017/09/16 12:00:00,7
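Another idea I have read about is making the writes idempotent, so that re-reading a file overwrites the existing documents instead of duplicating them. This is a sketch using the fingerprint filter (bundled with Logstash) to derive a stable document ID from the raw line; the key value is an arbitrary string I chose:

filter {
  fingerprint {
    source => "message"             # hash the raw CSV line
    target => "[@metadata][fp]"
    method => "SHA1"
    key => "dedup-key"              # arbitrary HMAC key
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "testindex1"
    document_type => "tesdata1"
    document_id => "%{[@metadata][fp]}"   # same line => same _id => update, not a new document
  }
}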
Any suggestions to solve this issue?
Thanks