UK Post Code to geo data in Logstash?

I'm looking for a way to add Geo location data to documents based on UK post code (no IP data, address data only).

I've searched but haven't been able to find anything, and I have to admit I'm not a developer, so I'm not sure what approach to take.

It looks like this API would provide the info I require: -

I'm running the latest versions of Elasticsearch (2.2), Logstash (2.2) and Kibana (4.4) as of today.

If anybody has post code to geo mapping working then some guidance would be much appreciated.


If you can obtain a full dump of the data you could store it in a local file and use the translate filter to map your postcodes to a geolocation. If you really need to make API lookups you'll probably have to write a custom plugin. It should cache the results so as not to slam the API endpoint with too many requests. I don't know of a general REST lookup plugin that would do this for you.
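A minimal sketch of the local-dictionary approach. The dictionary path and the source field name (`postcode`) are placeholders you'd adjust to your own setup:

```
filter {
  translate {
    dictionary_path => "/path/to/postcode_dictionary.yaml"
    field           => "postcode"
    destination     => "[geoip][location]"
  }
}
```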

Thanks @magnusbaeck.

I've obtained a dump of the data and can convert it to the required format, but I'm struggling to get the translate filter to read the dictionary file. To test, I have created a small file: -


and am using Logstash filter: -

                    translate {
                            dictionary_path => "/etc/logstash/POLatLong.yaml"
                            field => "hl7PostCode"
                            destination => "[geoip][location]"
                    }
but get error: -
" LogStash::Filters::Translate: Bad Syntax in dictionary file /etc/logstash/POLatLong.yaml"

I've tried placing double quotes around each side of the `:` but with no luck. I'm sure it's a simple one, but any pointers?



Scratch that - just found another post and spotted that I needed a space after the ": " field delimiter in the .yaml file.
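For anyone else who hits that "Bad Syntax in dictionary file" error: the YAML dictionary needs a space between the colon and the value. The postcode and coordinates below are just illustrative:

```yaml
# Rejected - no space after the colon:
# BD200NZ:53.913991,-1.931380

# Parsed - colon followed by a space:
BD200NZ: 53.913991,-1.931380
```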

Have added that and now working OK.

Thanks for your assistance @magnusbaeck

Hi all,

Did you get this functioning? I have ended up at the same point with a YAML file generated from National Statistics data (2.5 million lines) and it just hangs. Here is a sample:

AB10AA: 57.101474,-2.242851
AB10AB: 57.102554,-2.246308
AB10AD: 57.100556,-2.248342

If I use a small set (500 lines) it works fine. Does this plugin have a line limit?


The YAML parser might be running out of memory; CSV is probably cheaper to parse. You'll also likely need to extend Logstash's heap size.

Can you use CSV for a data lookup against an external source to populate a field?


[root@simont-fedvm ls-postcode-geoip]# LS_HEAP_SIZE="4g" /opt/logstash/bin/logstash -f /root/ls-postcode-geoip/ls-test-translate.conf
Settings: Default pipeline workers: 4
Pipeline main started
BD20 0NZ
{
       "message" => "BD200NZ",
      "@version" => "1",
    "@timestamp" => "2016-07-15T16:18:29.047Z",
          "host" => "simont-fedvm",
      "long-lat" => "53.913991,-1.931380"
}

All working, cheers.

Can you use CSV for a data lookup against an external source to populate a field?

How do you mean? CSV is a flat text file. Unfortunately the translate filter doesn't support lookups against e.g. databases.

Hi Lee,

Found this thread because I'm doing the same thing, then noticed your field name had "hl7". I'm using ELK for storing HL7 and just wondered what ideas you had, and how you were getting HL7 into Elasticsearch?