It's not an issue IMO, just a default configuration. FYI, here is a sample config file I just used to parse some CSV data:
input {
  stdin {}
}

filter {
  csv {
    separator => ";"
    columns => [
      "id", "name", "slug", "uic", "uic8_sncf", "longitude", "latitude",
      "parent_station_id", "is_city", "country",
      "is_main_station", "time_zone", "is_suggestable", "sncf_id",
      "sncf_is_enabled", "idtgv_id", "idtgv_is_enabled",
      "db_id", "db_is_enabled", "idbus_id", "idbus_is_enabled",
      "ouigo_id", "ouigo_is_enabled",
      "trenitalia_id", "trenitalia_is_enabled", "ntv_id", "ntv_is_enabled",
      "info_fr", "info_en", "info_de", "info_it", "same_as"
    ]
  }

  # The first line of the CSV repeats the column names, so drop it.
  if [id] == "id" {
    drop { }
  } else {
    # geo_point coordinates must be numeric.
    mutate {
      convert => {
        "longitude" => "float"
        "latitude"  => "float"
      }
    }
    # Move the coordinates into the [location][lon]/[location][lat]
    # structure that the geo_point mapping expects.
    mutate {
      rename => {
        "longitude" => "[location][lon]"
        "latitude"  => "[location][lat]"
      }
    }
    mutate {
      remove_field => [ "message", "host", "@timestamp", "@version" ]
    }
  }
}

output {
  # Debug outputs; remove them once the pipeline looks right.
  stdout { codec => rubydebug }
  stdout { codec => dots }

  # Logstash 1.4.x elasticsearch output. The template file is where
  # [location] gets mapped as a geo_point (see the sketch below).
  elasticsearch {
    protocol => "http"
    host => "localhost"
    index => "sncf"
    index_type => "gare"
    template => "sncf_template.json"
    template_name => "sncf"
    document_id => "%{id}"
  }
}
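For reference, the important part of sncf_template.json is mapping the location field as geo_point. The file itself isn't shown here, so this is just a minimal sketch of what such a template could look like on Elasticsearch 1.x (the index pattern and type name are taken from the config above; the rest is an assumption about the file's contents):

{
  "template": "sncf",
  "mappings": {
    "gare": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}

With a mapping like that in place, Kibana 4 can use the location field directly for a Tile Map.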
Hope this helps
On Sunday, April 26, 2015 at 13:50:54 UTC+2, Rodger Moore wrote:
Hi there again!
This problem is caused by what I believe is a bug in Logstash or
Elasticsearch. I used a very small test CSV file with only 1 or 2 records
per date. The default Logstash template creates 1 index per date, and for
some reason index creation goes wrong on field types when an index holds
only a few records (presumably because each new index derives its mapping
from dynamic type detection on the first documents it receives). After I
changed the index naming pattern in the output config to:
output {
  elasticsearch {
    protocol => "http"
    index => "logstash-%{+YYYY.MM}"
  }
}
thus creating only 1 index per month, the problem with wrong field types
was gone. If the folks from Elastic want to reproduce this, I enclosed the
config files and test file. Changed status to solved.
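By the way, a quick way to see which field types each index actually ended up with is the mapping API, e.g. (the index name here is just an example from my test set):

curl -XGET 'localhost:9200/logstash-2015.04.26/_mapping?pretty'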
Cheers,
Rodger.
On Saturday, April 25, 2015 at 22:13:45 UTC+2, Rodger Moore wrote:
Hi there!
My question is fairly simple, but I'm having trouble finding a solution. I
have a CSV file containing lat and lon coordinates in separate fields named
"Latitude" and "Longitude". Most of the info I found on the net is focused
on GeoIP (which is great functionality, btw), but besides some posts in
Google Groups (https://groups.google.com/forum/#!topic/elasticsearch/QaI1fj74RlM)
I failed to find a good tutorial for this use case.
What is the simplest way to get separate Long / Lat fields into a
geo_point and put those coordinates on a Tile Map in Kibana 4, using the
default Logstash (mapping), Elasticsearch, and Kibana settings? I am using
Logstash 1.4.2, Elasticsearch 1.5.0, and Kibana 4.0.1.
Summary: --> CSV containing Long / Lat in separate fields --> Logstash
--> ES --> Kibana 4?
Any help very much appreciated!
Cheers,
Rodger