reason"=>"Limit of total fields [1000] in index

I am getting the error below, but I can find the events in Kibana Discover. I increased the fields limit as follows:

```
PUT issqnet-2020.05.26/_settings
{
  "index.mapping.total_fields.limit": 10000
}
```
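To confirm the new limit actually took effect, the settings can be read back (a quick sanity check against the same index):

```
GET issqnet-2020.05.26/_settings
```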
I increased the limit and restarted Logstash, but I am still facing the same issue:
```
[2020-05-26T13:19:26,177][WARN ][logstash.outputs.elasticsearch][main][220c76045b556cf67d246e54931c092b729b0e8dcc83cd8283a28d6d531fc2bd] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"%{[@metadata][target_index]}", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x7c80078e>], :response=>{"index"=>{"_index"=>"%{[@metadata][target_index]}", "_type"=>"_doc", "_id"=>"xW2jUnIB3QufrXcV36cQ", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [%{[@metadata][target_index]}] has been exceeded"}}}}
```

It looks to me like the sprintf reference in the index option of the elasticsearch output did not get substituted, so you are not writing to the index that you changed the settings on.
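For context, the output section in question presumably looks something like this (a sketch; the hosts value is a placeholder, and the index option contains the sprintf reference from the error):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]          # placeholder
    index => "%{[@metadata][target_index]}"     # sprintf reference that should be substituted per event
  }
}
```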

The issqnet-2020.05.26 index was created and documents are being added to it. I do not understand what you mean. Can you please explain in more detail?

Elasticsearch is complaining that there are 1000 fields in an index literally named "%{[@metadata][target_index]}". If you are using

```
index => "%{[@metadata][target_index]}"
```

in an elasticsearch output in Logstash, then Logstash does the substitution. The only time Elasticsearch would ever see the index name "%{[@metadata][target_index]}" is if the [@metadata][target_index] field does not exist on some events.
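One way to guard against that (a sketch, assuming a date-based fallback index name is acceptable; the "issqnet-" pattern here is hypothetical) is to set a default before the output runs:

```
filter {
  # If upstream logic never set the metadata field, fall back to a default index name
  if ![@metadata][target_index] {
    mutate {
      add_field => { "[@metadata][target_index]" => "issqnet-%{+YYYY.MM.dd}" }
    }
  }
}
```

Events that reach the output without the field would then land in a predictable daily index instead of one literally named "%{[@metadata][target_index]}".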
