"name cannot be empty string" error if field name starts with

Although Elasticsearch 5 allows dots in field names, the following error occurred.

Is this intended behavior or a bug?

# curl -XPUT 'localhost:9200/twitter/tweet/1?pretty' -d'
> {
>     ".a" : "a"
> }
> '
{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "failed to parse"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "failed to parse",
    "caused_by" : {
      "type" : "illegal_argument_exception",
      "reason" : "name cannot be empty string"
    }
  },
  "status" : 400
}
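For comparison, a field name with a dot in the middle should index without error on the same version, since 5.x allows dots in field names; for instance, something like:

# curl -XPUT 'localhost:9200/twitter/tweet/2?pretty' -d'
> {
>     "a.b" : "a"
> }
> '

which Elasticsearch expands into the object {"a": {"b": "a"}}. Only the leading-dot case fails.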

Elasticsearch version: 5.0.1

JVM version: 1.8.0_102

OS version: CentOS 7.2

From the discussion here:

This looks like a side-effect of how dots in field names were implemented.
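Presumably (a sketch of the mechanism, not verified against the code), the field name is split on dots and expanded into an object path, so ".a" ends up being treated roughly like this document:

{
    "" : {
        "a" : "a"
    }
}

and the empty parent object name is what trips the "name cannot be empty string" check.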

What's your exact use-case that's causing this issue?

In my use case, our users freely send logs through fluentd, so this problem occurs with field names we don't control. That said, I think the users could change the field names themselves.
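If renaming at the source is hard, the keys could also be rewritten somewhere in the pipeline before indexing. A rough sketch with jq that strips a leading dot (in a real setup this would more likely be a fluentd filter; the exact config depends on the deployment):

# echo '{ ".a" : "a" }' | jq 'with_entries(.key |= ltrimstr("."))'
{
  "a": "a"
}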

This is a slightly dangerous use case, I think.

Elasticsearch is generally fine with having lots of fields, but some features, like doc_values, don't do well with sparse fields. If you disable doc_values on most fields (so no aggregation or sorting on them), then sparse fields are generally OK. Still, each field has a non-trivial overhead, and adding new fields on the fly requires a cluster state update, which can be time consuming. Having thousands of fields in an index doesn't tend to work well.
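For example (a sketch only; the index, type, and field names here are made up), disabling doc_values on a field looks like this in the mapping:

# curl -XPUT 'localhost:9200/twitter?pretty' -d'
> {
>     "mappings" : {
>         "tweet" : {
>             "properties" : {
>                 "some_sparse_field" : {
>                     "type" : "keyword",
>                     "doc_values" : false
>                 }
>             }
>         }
>     }
> }
> '

With doc_values off, the field can still be searched, but not sorted or aggregated on.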
