Please help me deal with geo_point in ES 2.1 and Logstash 2.1. I am unable to set the mapping of the location field the way I used to do in older versions using _river.


(Rizwan) #1

I used old ES versions with _river to fetch data. For the location I did the following in the mapping:

"Location": {"type": "geo_point"},

and in my data set I returned location.lat and location.lon.
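
For reference, the documents I indexed with that mapping supplied the coordinates as sub-fields, roughly like this (the numbers are just an example):

{ "Location": { "lat": 41.12, "lon": -71.34 } }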

I am unable to set the geo_point type in ES 2.1 and Logstash 2.1.

Please help me with this. I want to use it in Kibana.


(Rajeshkumar) #2

What index name are you setting in Logstash?


(Magnus Bäck) #3

More details please. What do your events look like, for example? The output of a stdout { codec => rubydebug } output would be helpful.
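
In case you are unsure what that produces: the rubydebug codec pretty-prints every event as a hash on Logstash's standard output, roughly like the block below, where all field names and values are made up for illustration:

{
       "message" => "some event",
    "@timestamp" => "2016-01-20T10:15:00.000Z",
      "@version" => "1",
          "host" => "example-host"
}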


(Rizwan) #4

I have multiple indexes. Let's say my index name is 'commission'.


(Rizwan) #5

If I am not wrong, you are asking about the output section of my Logstash config file!

output {
    if [type] == "A" {
        elasticsearch {
            hosts => "107.100.100.111"
            index => "test"
            document_type => "test_report"
            document_id => "%{uid}"
        }
    }
    if [type] == "B" {
        elasticsearch {
            hosts => "107.100.100.111"
            index => "test1"
            document_type => "test1_report"
            document_id => "%{uid}"
        }
    }

    stdout { codec => rubydebug }
}


(Magnus Bäck) #6

If I am not wrong, you are asking about the output section of my Logstash config file!

No, I wanted to know what your events look like, preferably as presented by the stdout { codec => rubydebug } output already in your config file.


(Rajeshkumar) #7

@mrizwan,
Change your index name to something like "logstash-commission", because geo_point will work on index names that begin with logstash-.
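
The reason is that Logstash ships a default index template which is only applied to indexes whose names match logstash-*. If you would rather keep your own index name, you can instead point the elasticsearch output at a template of your own. A rough sketch, assuming you keep such a template file at /etc/logstash/templates/commission.json (a path I made up) with the geo_point mapping inside it:

elasticsearch {
    hosts => "107.100.100.111"
    index => "commission"
    template => "/etc/logstash/templates/commission.json"
    template_name => "commission"
}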


(Rajeshkumar) #8

Make sure you have used the geopoint filter in your Logstash config.


(Rizwan) #9

@rajkamalcool6

What should the geopoint filter be?


(Rajeshkumar) #10

You need to add this inside your filter section; only then will the geo point get created.

geoip {
    source => "name of the field containing the IP address"
}

Also, your index name should be in the following format: "logstash-myindex".


(Rizwan) #11

@rajkamalcool6

I am not dealing with IPs here and my concern is about geo_point, not geoip. I am dealing with longitudes and latitudes, which I pick up in my SQL data set. There are two columns, Latitude and Longitude, so how should I use them in my filter?


(Rajeshkumar) #12

I don't think you can map longitudes and latitudes separately. If you pass the IP field to the geoip filter it automatically fetches the latitude and longitude fields from the IP address.


(Magnus Bäck) #13

I don't think you can map longitudes and latitudes separately. If you pass the IP field to the geoip filter it automatically fetches the latitude and longitude fields from the IP address.

@rajkamalcool6, again, there is no IP address here, so the geoip filter isn't useful. Enough about the geoip filter.

@mrizwan, there are two things at play here: the mapping of the field in Elasticsearch and what the document you send to Elasticsearch looks like. The ES documentation about the geo_point type describes what fields must look like to be convertible to geo_point values. Make sure you comply with those rules. Also check the mapping of your index so that the target field is mapped as geo_point.
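
As a rough sketch of both pieces, and only assuming that your index is called test, the type is test_report, ES listens on the default port 9200 and your events really do contain Latitude and Longitude fields (I can't know any of that without seeing your events): create the index with the field mapped as geo_point,

curl -XPUT 'http://107.100.100.111:9200/test' -d '
{
    "mappings": {
        "test_report": {
            "properties": {
                "location": { "type": "geo_point" }
            }
        }
    }
}'

and in the filter section turn the two columns into one object that matches the geo_point lat/lon format:

filter {
    # make sure the two columns are numeric
    mutate {
        convert => {
            "Latitude"  => "float"
            "Longitude" => "float"
        }
    }
    # then nest them under a single location field
    mutate {
        rename => {
            "Latitude"  => "[location][lat]"
            "Longitude" => "[location][lon]"
        }
    }
}

After that, the documents Logstash sends will contain a location object that Elasticsearch can index as a geo_point.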


(Rizwan) #14

@magnusbaeck - Thanks for the explanation, I have completely understood your point.

What I am doing is: I have mapped the field like this in Elasticsearch, "Location": {"type": "geo_point"}, and according to the ES rules geo_point expects the data set to provide the coordinates as sub-fields like these:
location.lat,
location.lon
Now, what do I have to do to set these two column values on the geo_point field? I think I have to do something like this in my filter:

grok {
    geo_ip => [location.lat, location.long]
}


(Magnus Bäck) #15

For the third time, please show the output of the stdout { codec => rubydebug } output already in your config file. I can't suggest what you should do if I don't know what your events look like.

