Mapping geo_point data type in logstash


(Chinmoy Das) #1

In my input JSON to Logstash (input from a PostgreSQL query), I have a string field containing the geolocation (example: 22.572645, 88.363892). The field name is payergeocode. When this data goes to Kibana through Elasticsearch, it does not get a geo_point data type. Please suggest what I need to write in logstash.conf to do the conversion.

I have tried the following and it did not work:

filter {
  if [type] == "cases" {
    mutate {
      convert => {
        "payergeocode" => "geo_point"
      }
    }
  }
}

Logstash version is 6.4.2. Elasticsearch version is 6.4.2.


(Christian Dahlqvist) #2

Converting fields in a mutate block only changes how they are represented in the JSON document sent to Elasticsearch. Since geo_point fields are not represented in any special way in the JSON document, but are instead interpreted by Elasticsearch, you cannot cast this in Logstash. You instead need to use an index template that contains the correct mapping.
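For example, a minimal 6.x index template carrying such a mapping might look like the sketch below (the template name, index pattern, and type name are illustrative and must match your Logstash output settings):

```
PUT _template/geopoint_template
{
  "index_patterns": ["case*"],
  "mappings": {
    "doc": {
      "properties": {
        "payergeocode": { "type": "geo_point" }
      }
    }
  }
}
```

With this in place, any newly created index matching case* will map payergeocode as geo_point, and a "lat,lon" string like "22.572645,88.363892" will be indexed as a geo point. Note the template only applies to indices created after it is installed.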


(Chinmoy Das) #3

Thanks for the quick reply.

I have created a mapping as below:
http://172.18.17.207:9200/_template/geotypetemplate_2
{
  "index_patterns": ["case*"],
  "mappings": {
    "doc": {
      "properties": {
        "case_id": {
          "type": "keyword"
        },
        "crtn_ts": {
          "type": "date"
        },
        "payergeocode": {
          "type": "geo_point"
        }
      }
    }
  }
}

My SQL query is: select case_id, crtn_ts, payergeocode from frm_gateway.frm_case

But when I start logstash, Elasticsearch gives the following error:
[2018-11-07T14:35:04,766][DEBUG][o.e.a.b.TransportShardBulkAction] [casegs][0] failed to execute bulk item (update) BulkShardRequest [[casegs][0]] containing [update {[casegs][caseg][ISBIC05112018100001], doc_as_upsert[true], doc[index {[casegs][caseg][ISBIC05112018100001], source[{"case_id":"ISBIC05112018100001","@timestamp":"2018-11-07T09:05:01.233Z","payergeocode":"22.572645,88.363892","type":"cases","@version":"1","crtn_ts":"2018-11-05T07:15:39.253Z"}]}], scripted_upsert[false], detect_noop[true]}]
java.lang.IllegalArgumentException: Rejecting mapping update to [casegs] as the final mapping would have more than 1 type: [caseg, doc]

Output in logstash.conf is as below:
output {
  stdout { codec => rubydebug }
  if [type] == "cases" {
    elasticsearch {
      index => "casegs"
      document_type => "caseg"
      document_id => "%{case_id}"
      doc_as_upsert => true
      action => "update"
      hosts => ["172.18.17.207:9200"]
    }
  }
}
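The "more than 1 type" rejection in the log happens because the template maps the type doc while this output writes documents with type caseg, and Elasticsearch 6.x allows only one mapping type per index. A sketch of one possible fix (untested against this setup) is to align the output's document_type with the type name used in the template:

```
output {
  if [type] == "cases" {
    elasticsearch {
      index => "casegs"
      document_type => "doc"  # must match the type name in the index template
      document_id => "%{case_id}"
      doc_as_upsert => true
      action => "update"
      hosts => ["172.18.17.207:9200"]
    }
  }
}
```

Alternatively, the template could be changed to use caseg as its type name; either way, the existing casegs index (already created with type caseg) would need to be deleted or reindexed before the geo_point mapping takes effect.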


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.