Mapping


#1

My logstash file,
filter {
  csv {
    columns => [
      "status",
      "country",
      "state",
      "city"
    ]
    separator => ","
  }
}
And I am mapping by using geo_point data field,

"mappings" : {
"default" : {
"properties" : {
"geohashlocation": {
"geohash": true,
"type": "geo_point",
"geohash_prefix": true },
"status" : {"type": "string", "index" : "not_analyzed" },
"country" : {"type": "geo_point"},
"state" : {"type": "geo_point"},
"city" : {"type": "geo_point"}


(Mark Walkom) #2

That is a terrible subject; please update it to something more descriptive.
Also, that screenshot is very hard to read; it'd be better if you pasted the text instead.

And please provide a sample of the data so we can see what you are trying to parse.


#3

I am sorry.

Actually, I am trying to load a CSV file which has the fields country, state, and city, so I created a Logstash config file:
filter {
  csv {
    columns => [
      "status",
      "country",
      "state",
      "city"
    ]
    separator => ","
  }
}
And I created my index by mapping these fields with the geo_point data type; my mapping looks like:
"mappings" : {
"default" : {
"properties" : {
"geohashlocation": {
"geohash": true,
"type": "geo_point",
"geohash_prefix": true },
"status" : {"type": "string", "index" : "not_analyzed" },
"country" : {"type": "geo_point"},
"state" : {"type": "geo_point"},
"city" : {"type": "geo_point"}

When I run my conf file, I get the error "reason: failed to parse".


(Mark Walkom) #4

I got that; you need to provide a sample of the data so we can see what is happening.


#5

My data looks like,

status,country,state,city
Ongoing,US,WA,Vancouver
Ongoing,US,NC,Raleigh
Ongoing,US,MO,Saint Louis
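For reference, the csv filter in the config above simply splits each of these lines on the separator and pairs the values with the configured column names. A minimal Python sketch of that parsing (hypothetical helper, not Logstash code):

```python
# Column names taken from the csv filter config in the posts above.
columns = ["status", "country", "state", "city"]

def parse_line(line, separator=","):
    # Split the line on the separator and map each configured
    # column name to the corresponding value.
    return dict(zip(columns, line.split(separator)))

event = parse_line("Ongoing,US,WA,Vancouver")
print(event)
# {'status': 'Ongoing', 'country': 'US', 'state': 'WA', 'city': 'Vancouver'}
```

The resulting fields are plain strings such as "US" or "WA", which is what Elasticsearch then tries (and fails) to parse as geo_point values.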


(Mark Walkom) #6

That's not a valid data set to be putting into a geo_point; it expects literal latitude and longitude numerical data.
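For context, a geo_point field has to receive coordinates rather than place names. A hedged sketch of a mapping and a document that would parse (the `location` field name and the coordinate values are hypothetical, illustrative only):

```
# Mapping: store the coordinates in a single geo_point field
"location" : { "type" : "geo_point" }

# A document that would parse; geo_point also accepts a "lat,lon"
# string or a geohash string in place of the object form
{ "status" : "Ongoing", "location" : { "lat" : 45.63, "lon" : -122.66 } }
```

Fields like country, state, and city should stay as string types; only a field carrying actual latitude/longitude data should be mapped as geo_point.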

