How to process "Lat" & "Long" fields using the default Logstash config and mapping, for use in a Kibana 4 tile map

Hi there!

My question is fairly simple, but I'm having trouble finding a solution. I
have a CSV file containing Lat and Lon coordinates in separate fields named
"Latitude" and "Longitude". Most of the info I found on the net is focused
on GeoIP (which is great functionality btw), but besides some posts in
Google Groups (https://groups.google.com/forum/#!topic/elasticsearch/QaI1fj74RlM)
I failed to find a good tutorial for this use case.

What is the simplest way of getting separate Long / Lat fields into a
geo_point and putting these coordinates on a Tile Map in Kibana 4 using the
default Logstash (mapping) - ES - Kibana settings? I am using Logstash
1.4.2, Elasticsearch 1.5.0 and Kibana 4.0.1.

Summary: --> CSV containing Long / Lat in separate fields --> Logstash -->
ES --> Kibana 4?

Any help very much appreciated!

Cheers,

Rodger


You need to use the mutate filter and move your fields into a single "location" field.

--
David :wink:
Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs
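
For illustration, a minimal sketch of that idea using the field names from the original question (the target index still needs a template that maps "location" as geo_point, otherwise Kibana won't offer it for the tile map):

filter {
  csv {
    # list every column of your file here, in order; only the coordinate fields are shown
    columns   => ["Latitude", "Longitude"]
    separator => ";"
  }

  # the csv filter produces strings, so cast the coordinates to numbers first
  mutate {
    convert => { "Latitude"  => "float" }
    convert => { "Longitude" => "float" }
  }

  # then nest both values under a single field that can be mapped as geo_point
  mutate {
    rename => {
      "Latitude"  => "[location][lat]"
      "Longitude" => "[location][lon]"
    }
  }
}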


Hi David,

Thanks, but I'm struggling to get this done. I tried different things. For
example:

filter {
  csv {
    columns => ["NAME","DATE","LAT","LONG"]
    separator => ";"
    add_tag => ["csv_parse_successfull"]
  }

  date {
    match => ["DATE", "dd-MM-YYYY HH:mm:ss"]
  }

  mutate {
    add_field => ["temp", "%{LAT}, %{LONG}"]
  }

  mutate {
    replace => ["geoip.location", "%{temp}"]
  }
}

For some reason this changes the type of "geoip.location" instead of keeping
it a geo_point filled with "%{LAT}, %{LONG}".
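
In case it helps anyone following along: one way to see which type Elasticsearch actually assigned is to ask for the index mapping directly (the index name below is just an example of a daily logstash index):

curl -XGET 'localhost:9200/logstash-2015.04.25/_mapping?pretty'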

Any suggestions?

Thanks


Hi there again!

This problem is caused by what I believe is a bug in Logstash or
Elasticsearch. I used a very small test CSV file with only 1 or 2 records
per date. The default Logstash configuration creates one index per day. For
some reason index creation goes wrong with respect to field types when there
are very few records per index. After I changed the index pattern in the
output config to:

output {
  elasticsearch {
    protocol => "http"
    index => "logstash-%{+YYYY.MM}"
  }
}

thus creating only one index per month, the problem with the wrong field types
was gone. If the folks from Elastic want to reproduce this, I enclosed the
config files and test file. Changed status to solved.
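
For reference, the template that the default Logstash elasticsearch output installs (which, among other things, maps geoip.location as geo_point for logstash-* indices) can be inspected directly; the template name below assumes a stock install:

curl -XGET 'localhost:9200/_template/logstash?pretty'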

Cheers,

Rodger.


It's not an issue IMO, just the default configuration.

FYI here is a sample config file I just used to parse some CSV data:

input {
  stdin {}
}

filter {
  csv {
    separator => ";"
    columns => [
      "id","name","slug","uic","uic8_sncf","longitude","latitude","parent_station_id","is_city","country",
      "is_main_station","time_zone","is_suggestable","sncf_id","sncf_is_enabled","idtgv_id","idtgv_is_enabled",
      "db_id","db_is_enabled","idbus_id","idbus_is_enabled","ouigo_id","ouigo_is_enabled",
      "trenitalia_id","trenitalia_is_enabled","ntv_id","ntv_is_enabled","info_fr",
      "info_en","info_de","info_it","same_as"
    ]
  }

  # skip the CSV header row (where the id column contains the literal "id")
  if [id] == "id" {
    drop { }
  } else {
    mutate {
      convert => { "longitude" => "float" }
      convert => { "latitude" => "float" }
    }

    # nest lon/lat under a single "location" field (mapped as geo_point by the template)
    mutate {
      rename => {
        "longitude" => "[location][lon]"
        "latitude" => "[location][lat]"
      }
    }

    mutate {
      remove_field => [ "message", "host", "@timestamp", "@version" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
  stdout { codec => dots }

  elasticsearch {
    protocol => "http"
    host => "localhost"
    index => "sncf"
    index_type => "gare"
    template => "sncf_template.json"
    template_name => "sncf"
    document_id => "%{id}"
  }
}

Hope this helps
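
The sncf_template.json referenced above isn't included in the thread; the essential part of such a template is a geo_point mapping for the location field. A rough sketch of what it could contain (Elasticsearch 1.x template syntax; apart from the geo_point type and the field names from the config above, everything here is illustrative):

{
  "template": "sncf",
  "mappings": {
    "gare": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}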


Hi David,

Thanks again for your answer. For some reason I am doing something wrong
and it's driving me nuts. I've tried your method but the tile map is showing
me no results whatsoever. How did you define your template in Elasticsearch
for this "location" field?

Thanks,

Rodger
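
A hedged side note that may explain this symptom: in Elasticsearch 1.x an existing field mapping cannot be changed in place, so if the index was first created before the template and the rename were in place, "location" may already be mapped as something other than geo_point. For a small test index the simplest fix is to delete it and let Logstash re-create it with the template applied (destructive, only for test data; the index name follows the config above):

curl -XDELETE 'localhost:9200/sncf'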


Lucky you! I just blogged about it :slight_smile:

https://david.pilato.fr/blog/2015-04-28-exploring-capitaine-train-dataset/

--
David :wink:
Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs



Merci beaucoup, even in English!

Nice blog, this helps a lot :slight_smile:

Cheers
