Regarding the Logstash performance limit

Hi experts,

I have run into a Logstash performance limit issue. When I parse my log, Logstash reports this error:

```
[2019-09-03T14:21:49,106][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.09.03", :_type=>"_doc", :_routing=>nil}, #<LogStash::Event:0x1b6c342a>], :response=>{"index"=>{"_index"=>"logstash-2019.09.03", "_type"=>"_doc", "_id"=>"4_PJ9WwB6iwMcN-ZT20T", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [logstash-2019.09.03] has been exceeded"}}}}
```

This blocks the parsed data from being sent to Elasticsearch. From the message, it looks like a total fields limit. How can I enlarge this parameter, or is there some other way to solve this issue? Thank you.
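For reference, the current limit and the fields already mapped on the daily index can be checked with the standard settings and mapping APIs (a sketch using the index name from the error above; adjust the host if Elasticsearch is not on localhost):

```
# Show index settings; index.mapping.total_fields.limit defaults to 1000 when not set explicitly
curl -XGET 'http://localhost:9200/logstash-2019.09.03/_settings?pretty'

# Show the mapping to see which fields have accumulated in the index
curl -XGET 'http://localhost:9200/logstash-2019.09.03/_mapping?pretty'
```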

That is an Elasticsearch limit and not managed through Logstash. How come you have so many fields?

The log contains a lot of data, and I have added many fields to map it, so the field count is huge. Is there a way to enlarge mapping.total_fields.limit?

I have found a similar issue; here is a method to enlarge the limit. I will try it later.

```
curl -XPUT localhost:9200/_template/twothousandfieldslimit -H 'Content-Type: application/json' -d '
{
    "order" : 1,
    "template" : "twitter*",
    "settings" : {
        "index" : {
            "mapping.total_fields.limit" : "2000"
        }
    }
}'
```

The solution is:

```
curl -XPUT 'http://localhost:9200/lsh_nokia_nokiasbc-2019.09.*/_settings?pretty' -H 'Content-Type: application/json' -d '
{
  "index.mapping.total_fields.limit" : "2000"
}'
```
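Note that updating `_settings` only affects indices that already exist. Since Logstash creates a new index each day, the raised limit would need to go into an index template matching the index pattern to carry over to future days. A sketch assuming a 6.x/7.x cluster and the legacy template API (the template name and pattern here are illustrative, not from the thread):

```
curl -XPUT 'http://localhost:9200/_template/raise_field_limit?pretty' -H 'Content-Type: application/json' -d '
{
  "order" : 1,
  "index_patterns" : ["lsh_nokia_nokiasbc-*"],
  "settings" : {
    "index.mapping.total_fields.limit" : 2000
  }
}'
```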
