Hi, I am trying to load geo data into Elasticsearch using Logstash.
My .csv file looks like this:
"col1","col2","col3","col4","col5"
"dilip","2017-05-28","45.5","12.56789,72.23456","43"
My .config file looks like this:
input {
  file {
    path => "/home/dilip/Downloads/data/try1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4","col5"]
  }
  mutate { convert => ["col3", "float"] }
  mutate { convert => ["col5", "integer"] }
  mutate { convert => ["col4", "geo_point"] }
  date { match => ["col2", "[yyyy-mm-dd HH:mm:ss:SSS]"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "try_index3"
    document_type => "trydata2"
  }
  stdout {}
}
But Logstash creates the pipeline and then closes it. Any suggestions on how I can define the geo point data?
warkolm (Mark Walkom), July 11, 2017, 9:12am
That is not valid.
You need to join the lat and lon fields into a single one.
DILIP_SHARMA:
columns => ["col1","col2","col3","col4","col5"]
It'd be easier to give them human-readable names.
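
For illustration, a minimal sketch of that filter block with both suggestions applied. It assumes the invalid geo_point convert is dropped (the mutate filter only converts to types like integer, float, string, and boolean), that col4 is renamed to an assumed field name, location, and left as a "lat,lon" string for Elasticsearch to parse, and that the date pattern is corrected to yyyy-MM-dd to match the sample data:

filter {
  csv {
    separator => ","
    columns => ["col1","col2","col3","col4","col5"]
  }
  mutate { convert => { "col3" => "float" } }
  mutate { convert => { "col5" => "integer" } }
  # "geo_point" is not a valid mutate convert type, so that line is dropped;
  # instead, keep col4 as its "lat,lon" string and rename it to the field
  # the index mapping will declare as geo_point ("location" is an assumed name)
  mutate { rename => { "col4" => "location" } }
  # the sample data is a bare date; MM is month, lowercase mm would be minutes
  date { match => ["col2", "yyyy-MM-dd"] }
}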
I just created a test file, that's why the names are like this.
I already joined both in the col4 column; if you look at my data: "12.56789,72.23456".
Is there anything else I need to do?
warkolm (Mark Walkom), July 11, 2017, 9:15am
Make sure you have a template/mapping that sets the field to a geo_point in Elasticsearch.
Sorry, not sure. Are you talking about this?
{
  "mappings": {
    "my_type": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
Where do I need to set this?
warkolm (Mark Walkom), July 11, 2017, 9:39am
Yes, that needs to live in Elasticsearch against the index.
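
For example, a minimal sketch of creating the index with that mapping before the first Logstash run. This assumes Elasticsearch 5.x (where mapping types still exist), reuses the try_index3 index and trydata2 type from the config above, and uses the assumed location field name:

# index, type, and field names are taken from the config above or assumed
curl -XPUT 'http://localhost:9200/try_index3' -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "trydata2": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}'

Alternatively, the elasticsearch output plugin's template, template_name, and template_overwrite options can point Logstash at an index template that applies the same mapping automatically.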
system (system), closed August 8, 2017, 9:39am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.