Hi, I am trying to load geo data into Elasticsearch using Logstash.
My .csv file looks like this:
"col1","col2","col3","col4","col5"
"dilip","2017-05-28","45.5","12.56789,72.23456","43"
My .config file looks like this:
input {
  file {
    path => "/home/dilip/Downloads/data/try1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4","col5"]
  }
  mutate { convert => ["col3", "float"] }
  mutate { convert => ["col5", "integer"] }
  mutate { convert => ["col4", "geo_point"] }
  # col2 only contains a date (e.g. "2017-05-28"), so match it as yyyy-MM-dd
  date { match => ["col2", "yyyy-MM-dd"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "try_index3"
    document_type => "trydata2"
  }
  stdout {}
}
But Logstash only creates the pipeline and then shuts it down. Any suggestions on how I can define the geo_point data?
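From what I have read, mutate's convert only supports scalar types (integer, float, string, boolean), not geo_point, so I suspect that line is what kills the pipeline. My understanding is that geo_point has to be declared in the Elasticsearch index mapping before indexing, something like this (untested; the index, type, and field names are just my own from the config above):

  PUT try_index3
  {
    "mappings": {
      "trydata2": {
        "properties": {
          "col4": { "type": "geo_point" }
        }
      }
    }
  }

Since Elasticsearch apparently accepts a geo_point given as a "lat,lon" string, I assume I would then drop the mutate convert for col4 in the filter and leave it as a plain string. Is that the right approach?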