Error parsing geo_point in Logstash


(yaser no) #1

I send this JSON to Logstash:

{ "location" :45.5 }

This is my conf.d config file:

input {
  rabbitmq {
    user => 'user'
    password => ''
    exchange => ''
    queue => '****'
    durable => true
    host => 'ip'
    subscription_retry_interval_seconds => 5
    codec => 'json'
    type => 'test_type'
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.6:9200","192.168.1.7:9200"]
    codec => 'json'
    index => "index_test"
  }
}

This is the error in the Logstash log:

[logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>
["index", {:_id=>nil, :_index=>"index_test", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x4da5b444], :response=>{"index"=>{"_index"=>"index_test", "_type"=>"doc", "_id"=>"ZMQfZWABq4vMDUHKdlPb", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

And this is my mapping:

PUT index_test
{
  "mappings": {
    "test_type": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}

Can anyone help?


(Mark Walkom) #2

A location is a latitude and a longitude; you only have one of those.
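For context (background added here, not part of the original replies): Elasticsearch accepts a geo_point in several equivalent JSON forms, all of which carry both coordinates, for example:

```json
{ "location": { "lat": 45.5, "lon": 51.2 } }
{ "location": "45.5,51.2" }
{ "location": [51.2, 45.5] }
```

Note that the string form is "lat,lon" while the array form is [lon, lat].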


(yaser no) #3

What do you mean? If you are talking about the JSON data I send, I tried it in multiple forms, like:

    "location" : {
        "latitude" : 41,
        "longitude" : 71

or:

    "location" : {41, 71}

I don't know which format works.

Can you please explain completely?


(Mark Walkom) #4

That is not a proper geo_point; it is only one value.


(yaser no) #5

Yes, of course, but I sent it in this form: "location" : {45,51} and Logstash logs this error:

[2017-12-18T08:56:02,247][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch.
{:status=>400, :action=>["index", {:_id=>nil, :_index=>"index_test", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x1546c159], :response=>{"index"=>{"_index"=>"index_test", "_type"=>"doc", "_id"=>"4cQWaGABq4vMDUHKBmb8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"illegal latitude value [269.9999986588955] for location"}}}}}


(Magnus Bäck) #6

yes ofcourse , but i sent in in this mode :: "location" : {45,51} but logstash return this error in log :

{45,51} isn't valid JSON, so I'm not sure what you mean.
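As a sanity check (a Python sketch added here, not part of the thread's setup): {45,51} is rejected by any strict JSON parser, while the object, string, and array forms parse fine.

```python
import json

# {45, 51} looks like a set literal from some languages, but it is not
# valid JSON: objects require "key": value pairs with string keys.
try:
    json.loads('{"location": {45, 51}}')
    parsed = True
except json.JSONDecodeError:
    parsed = False
print(parsed)  # False

# These geo_point representations are valid JSON:
obj = json.loads('{"location": {"lat": 45.0, "lon": 51.0}}')
txt = json.loads('{"location": "45.0,51.0"}')   # "lat,lon" string
arr = json.loads('{"location": [51.0, 45.0]}')  # [lon, lat] array
print(obj["location"], txt["location"], arr["location"])
```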


(yaser no) #7

I mean, is everything else in my setup OK, and is my JSON format the main problem? Because I have tested many JSON formats and none of them work!


(Mark Walkom) #8

Please show an example document.


(yaser no) #9

"{"Data" : "{data}","Snr" : {snr},"SequenceNumber" : {seqNumber},"DeviceID" : "{device}","StationID" : "{station}","RSSI" : {rssi},"Latitude" : {lat},"Longitude" :{lng},"AvgSnr" : {avgSnr},"Duplicate" : "{duplicate}"}",

This is sent to RabbitMQ as my JSON content.

Thanks, here it is. I want to store "Latitude" and "Longitude" in a geo_point field, but I can't.
One more thing to know: all the {rssi}, {snr}, {lat}, {lng}, etc. placeholders are replaced with real values.
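One way to do that (a sketch only, assuming the Latitude and Longitude field names from the document above) is to combine the two fields into the "lat,lon" string form in a Logstash filter:

```
filter {
  mutate {
    # Combine the two numeric fields into the "lat,lon" string form
    # that a geo_point-mapped field accepts.
    add_field => { "location" => "%{Latitude},%{Longitude}" }
    # Optionally drop the now-redundant source fields.
    remove_field => [ "Latitude", "Longitude" ]
  }
}
```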


(Magnus Bäck) #10

Please show us what you actually send to Elasticsearch. You can temporarily replace your elasticsearch output with a stdout { codec => rubydebug } output to dump the raw events to the log.


(yaser no) #11

Hi again, and thanks so much for your answers.
This is the rubydebug output:

{
          "type" => "my_type",
      "@version" => "1",
    "@timestamp" => 2017-12-18T19:29:56.056Z,
      "location" => "1,5"
}

And Logstash still logs this:

[2017-12-18T23:04:12,503][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"my_index", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x1549ea87], :response=>{"index"=>{"_index"=>"my_index", "_type"=>"doc", "_id"=>"nmMea2ABSZoa10DTjIn5", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [my_index] as the final mapping would have more than 1 type: [doc, my_type]"}}}}
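That last error is no longer about geo_point: Elasticsearch 6.x allows only one mapping type per index, the index mapping was created under one type (my_type above, test_type earlier) while Logstash indexes with _type doc by default. A sketch of one fix (index name my_index taken from the log above) is to recreate the index with the mapping under the type that is actually used for indexing:

```json
PUT my_index
{
  "mappings": {
    "doc": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```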


(yaser no) #12

Solved, with the configuration below:

I added document_type => "type1" and it worked :blush:

input {
  rabbitmq {
    user => "user"
    password => "pass"
    exchange => "exc"
    queue => "logstash"
    durable => true
    host => "ip"
    subscription_retry_interval_seconds => 5
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.6:9200","192.168.1.7:9200"]
    codec => "json"
    index => "my_index1"
    document_type => "type1"
  }
  file {
    path => "/Log/rabbitmq_debug_events-%{+YYYY-MM-dd}"
    codec => rubydebug
  }
}

Thank you, Elasticsearch team!


(system) #13

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.