Help on mapping for "catch-all" fields

In a medium-sized company, with 12 developers, I set up a central logging pipeline with Logstash + Elasticsearch where anyone can log anything.

Log documents look like this:

{"date" : "....", "data" : {"myfield1":"hello", "myfield2": 3, "myfield3" : true ....}, "name" : "click"...}

Here data is the "catch-all" field: myfield1 can be anything, posted by the developer. Problems appear when dev A posts myfield1 as a string and dev B posts it as a number; then I get the famous:

{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [event.data.tags] tried to parse field [tags] as object, but found a concrete value"}

What could be the best way to handle such a situation with Logstash and Elasticsearch? I am thinking about 3 solutions:

  • Force devs to respect a schema with proper naming
  • Force devs to use a naming convention such as "myfield1_{TYPE}" for a field of type {TYPE}
  • Transform field names according to their type into "myfield1_{TYPE}" at ingest time
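To make the third option concrete, here is a minimal sketch of the renaming logic, e.g. for use inside a Logstash `ruby` filter. The suffix names (`num`, `bool`, `obj`, `string`) and the assumption that the payload sits under a flat `data` hash are illustrative choices, not a fixed convention:

```ruby
# Map a Ruby value to a type suffix (assumed naming scheme).
def suffix_for(value)
  case value
  when Integer, Float        then "num"
  when TrueClass, FalseClass then "bool"
  when Hash                  then "obj"
  else                            "string"
  end
end

# Rewrite every key in the catch-all hash to "key_{suffix}",
# so conflicting types land in distinct Elasticsearch fields.
def retype(data)
  data.each_with_object({}) do |(key, value), out|
    out["#{key}_#{suffix_for(value)}"] = value
  end
end

retype({ "myfield1" => "hello", "myfield2" => 3, "myfield3" => true })
# => {"myfield1_string"=>"hello", "myfield2_num"=>3, "myfield3_bool"=>true}
```

In a pipeline this would run as `code` inside a `filter { ruby { ... } }` block, reading and writing the event's `[data]` field; nested hashes would need a recursive variant of `retype`.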

Thanks,

A mapping cannot accept objects and primitives (string/number) in the same field.

In your example, these two documents conflict:

{
  "event" : {
    "data" : {
      "tags" : "foobar"
    }
  }
}

and

{
  "event" : {
    "data" : {
      "tags" : {
          "foo" : "bar"
      }
    }
  }
}

Force your developers to consistently send either an object or a primitive to any given field.
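If you do go with a suffix convention like "myfield1_{TYPE}", you can back it with dynamic templates so Elasticsearch picks the mapping from the field name. A sketch (template names and patterns are illustrative; the exact mapping syntax varies by Elasticsearch version):

```json
{
  "mappings": {
    "dynamic_templates": [
      {
        "nums_by_suffix": {
          "match": "*_num",
          "mapping": { "type": "double" }
        }
      },
      {
        "strings_by_suffix": {
          "match": "*_string",
          "mapping": { "type": "keyword" }
        }
      }
    ]
  }
}
```

This way a wrongly typed value fails fast against an explicit mapping instead of silently defining it for everyone else.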

Yep, I understand, thank you.