Failing to index geo_shape data into Elasticsearch from Logstash (nil values)

I have Ruby code in Logstash that parses a CSV file and reads the data. Some fields in the data can be empty (e.g. ,,,). When a field is empty, Ruby's CSV parser converts it to an empty string (""). I also have an index that defines these fields as "geo_shape", so when Elasticsearch receives the empty string it fails to parse it with this error:

"error" => "type" => "mapper_parsing_exception" => "failed to parse field [parametrics.location (L)] of type [geo_shape], "caused by" => "type" => "parse_exception", "expected word but found: END-OF-STREAM"}}}}

I was able to verify using Kibana > Dev Tools that I could create the index and then POST data to it if the value was changed from an empty string ("") to null.
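For example, a request like this succeeds in Dev Tools against the mapping shown below (the document body here is just an illustration), while the same request with "" in place of null fails with the parse error above:

POST data/_doc
{
  "parametrics": {
    "location (L)": null
  }
}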

But I can't figure out how to tell Logstash to set the null value from within the ruby filter's code section. Here's what I've tried.

Index:

PUT data
{
  "mappings": {
    "properties": {
      "parametrics": {
        "properties": {
          "location (L)": {
            "type": "geo_shape"
          }
        }
      }
    }
  }
}

Snippet of the Logstash ruby filter (note: the Ruby code before this snippet parses the CSV file correctly; when a value is not empty it populates the data fields as expected. The problem I'm trying to resolve only occurs when the value is empty/not populated in the CSV):

...
ruby {
  code => '
    event.get("parametrics", parametrics)

    if parametrics["location (L)"] == ""
      event.set("parametrics[location (L)", nil)
    end
  '
}
...

Ruby debug output:

"parametrics" => {
    "location (L)" => ""
}

Ultimately, what I'm trying to do is make the resulting JSON document that gets sent to Elasticsearch contain null in any field that is empty, so that when the value being parsed is something like "LINESTRING(lon1 lat1, lon2 lat2, lon3 lat3)" it is still correctly indexed into Elasticsearch in geo_shape (WKT) format.

Thanks

I would expect you to be getting _rubyexception with that. The event.get("parametrics", parametrics) should be parametrics = event.get("parametrics") and

event.set("parametrics[location (L)", nil)

should be

event.set("[parametrics][location (L)]", nil)

Thanks @Badger, I had to type this up by hand rather than copy and paste. Your suggestions worked. As an alternative, I also just removed the key/value pair from the hash.
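For anyone finding this later, the hash-based alternative looked roughly like this (retyped from memory, so treat it as a sketch):

ruby {
  code => '
    parametrics = event.get("parametrics")

    if parametrics && parametrics["location (L)"] == ""
      # Drop the empty field entirely instead of setting it to nil,
      # then write the modified hash back onto the event
      parametrics.delete("location (L)")
      event.set("parametrics", parametrics)
    end
  '
}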

Scott
