Need some help with Geo Enrichment - SOLVED

Hi All,

I have a Logstash configuration that pulls sales activity over a JDBC connection.
This data has a locatorID.

My configuration looks like:

input {
  jdbc {
    jdbc_connection_string => "jdbcstring"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/Users/sqljdbc4.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT * from SALES where PURCHASE_DATE_ET = '2016-02-19' and STANDARD_AMOUNT > 0"
  }
}

output {
  stdout { codec => json_lines }

  elasticsearch {
    index => "purchases"
    document_type => "purchase"
  }
}

I then have a separate reference file to look up the locatorID and get a long & lat pair. The CSV file looks like this:

1,"Goroka","Goroka","Papua New Guinea","GKA","AYGA",-6.081689,145.391881,5282,10,"U","Pacific/Port_Moresby"

Where the format is:

ID, City, Name, Country, Ref1, Ref2, Latitude, Longitude, Alt, Timezone, Offset, TzTimezone.

How do I join these two sources and enrich the data prior to loading into Elasticsearch?

Thanks
Xathras

Check out the translate filter - https://www.elastic.co/guide/en/logstash/2.1/plugins-filters-translate.html

You can use that to pull the lat+long in and then merge them into a single geo field, as sketched below.
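For example, something along these lines (a rough sketch; the field name, destination field, and dictionary path are placeholders, not from your config):

filter {
  translate {
    # field holding the code to look up (e.g. your locatorID)
    field => "locatorID"
    # flat key/value dictionary mapping code => "lat,long"
    dictionary_path => "/path/to/geocord.yaml"
    # new field to receive the looked-up "lat,long" string
    destination => "geo"
  }
}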

Thanks for the help. I am getting further, but now have some issues with the translation.

I am now getting the following error:

    Settings: Default pipeline workers: 8
    LogStash::Filters::Translate: Bad Syntax in dictionary file geocord.yaml

The geocord file looks like:

    - ICAO: AYGA
      GEO: -6.081689,145.391881
    - ICAO: AYMD
      GEO: -5.207083,145.7887
    - ICAO: AYMH
      GEO: -5.826789,144.295861

My filter and translate section now includes:

filter {
  translate {
    field => "destination_airport_code"
    destination => "ICAO"
    dictionary_path => "geocord.yaml"
  }
}

Any ideas?

Your file is not valid; it should be "AYGA": -6.081689,145.391881.
See the example here https://www.elastic.co/guide/en/logstash/2.1/plugins-filters-translate.html#plugins-filters-translate-dictionary_path

Thanks, but I tried that and it didn't work, so I went to a YAML generator to see if that was the problem. Here is an example of the file:

"AYGA":-6.081689,145.391881
"AYMD":-5.207083,145.7887
"AYMH":-5.826789,144.295861

Ugh, I found it. My bad. Look at the simple difference (the value needs to be quoted):

"KORD": "41.978603,-87.904842"

Thank you so much for the gentle nudge

You should either create a template or a mapping for the index that sets this field as a geo_point; there are a bunch of threads that can help, but shout out if you get stuck.
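As a starting point, a minimal ES 2.x template would look something like this (a sketch only; the index pattern and type name are assumptions based on your setup):

{
  "template" : "purchases",
  "mappings" : {
    "purchase" : {
      "properties" : {
        "location" : { "type" : "geo_point" }
      }
    }
  }
}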

Really need some help, warkolm, I'm banging my head on this one :frowning:

Here is what I've been able to do thus far:

  1. Add a mutate filter to split airport on ',', so the string containing the latitude and longitude becomes an array
  2. Add a mutate filter to add fields called latitude and longitude, defined as airport[0] and airport[1]
  3. Add a mutate filter to convert them from string (which is what they natively are) to float
  4. In a mutate filter, rename longitude and latitude to [location][lon] and [location][lat]
  5. In my output section, set manage_template to true and pass in the template location with template_overwrite set to true

When I run, I get the following error:

"status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [default]: Mapping definition for [location] has unsupported parameters: [dynamic : true]", "caused_by"=>{"type"=>"mapper_parsing_exception", "reason"=>"Mapping definition for [location] has unsupported parameters: [dynamic : true]"}}}}, :level=>:warn}

Here is the config:

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://host:1433;Database=db"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/Users/wtaylor/Downloads/sqljdbc_4.0/enu/sqljdbc4.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT top 5 * from SALES where PURCHASE_DATE_ET = '2016-02-22' and STANDARD_AMOUNT > 0"
  }
}

# Look up the airport code and write the "lat,long" string to [airport]
filter {
  translate {
    field => "destination_airport_code"
    dictionary_path => "/Users/wtaylor/Downloads/logstash-2.2.2/bin/geocord.yaml"
    fallback => "unknown"
    destination => "airport"
  }
}

# Split "lat,long" into a two-element array
filter {
  mutate {
    split => { "airport" => "," }
  }
}

# Copy the array elements into their own fields
filter {
  mutate {
    add_field => {
      "latitude" => "%{[airport][0]}"
      "longitude" => "%{[airport][1]}"
    }
  }
}

# The copied values are strings, so convert them to floats
filter {
  mutate {
    convert => {
      "longitude" => "float"
      "latitude" => "float"
    }
  }
}

# Nest them under [location] as lon/lat
filter {
  mutate {
    rename => {
      "longitude" => "[location][lon]"
      "latitude" => "[location][lat]"
    }
  }
}

output {
  stdout { codec => json_lines }

  elasticsearch {
    index => "bre"
    document_type => "purchase"
    manage_template => true
    template => "/Users/wtaylor/Downloads/logstash-2.2.2/bin/template.json"
    template_overwrite => true
  }
}

My template JSON looks like:

{
  "template" : "bre",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "default" : {
      "_all" : { "enabled" : true, "omit_norms" : true },
      "properties" : {
        "@timestamp" : { "type" : "date", "doc_values" : true },
        "@version" : { "type" : "string", "index" : "not_analyzed", "doc_values" : true },
        "location" : {
          "type" : "geo_point",
          "dynamic" : true,
          "doc_values" : true,
          "lat_lon" : true
        },
        "geoip" : {
          "type" : "object",
          "dynamic" : true,
          "properties" : {
            "ip" : { "type" : "ip", "doc_values" : true },
            "latitude" : { "type" : "float", "doc_values" : true },
            "longitude" : { "type" : "float", "doc_values" : true }
          }
        }
      }
    }
  }
}


I fixed it :slightly_smiling:

I need to make some amendments to my template, but I was able to get some fields with geo working.

That "dynamic" : true setting is not valid there, so remove it from both location and geoip and you should be good.
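For reference, the location mapping from the template above would then look like:

"location" : {
  "type" : "geo_point",
  "doc_values" : true,
  "lat_lon" : true
}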

warkolm, thanks so much for all your support. Per my note above, I was able to solve it, and as you mentioned, that was the change I made.

I'm still working on some weird items on my map: the heat map is showing points that I know aren't in the source, but...

How did you get it to work? It would help if you posted your solution.