Hi experts,
I have run into an issue where Logstash cannot index my parsed logs into Elasticsearch. When I parse my log, Logstash reports this error:
```
[2019-09-03T14:21:49,106][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.09.03", :_type=>"_doc", :_routing=>nil}, #<LogStash::Event:0x1b6c342a>], :response=>{"index"=>{"_index"=>"logstash-2019.09.03", "_type"=>"_doc", "_id"=>"4_PJ9WwB6iwMcN-ZT20T", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [logstash-2019.09.03] has been exceeded"}}}}
```
Because of this, the parsed data cannot be sent to Elasticsearch. From the message, it looks like the index's total fields limit of 1000 has been exceeded. How can I increase this limit, or is there another way to solve this issue? Thank you.
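For reference, I came across the index setting `index.mapping.total_fields.limit` and was thinking of raising it via an index template so that each new daily `logstash-*` index picks it up, plus updating today's index directly. A rough sketch of what I had in mind is below (run from Kibana Dev Tools; the template name and the value 2000 are just placeholders I chose, not anything official):

```
# Hypothetical template so future daily logstash-* indices allow more fields
PUT _template/logstash-total-fields
{
  "index_patterns": ["logstash-*"],
  "order": 1,
  "settings": {
    "index.mapping.total_fields.limit": 2000
  }
}

# Raise the limit on the existing index that is currently rejecting events
PUT logstash-2019.09.03/_settings
{
  "index.mapping.total_fields.limit": 2000
}
```

Would this be the right approach, or is it better to rework my parsing so that fewer distinct fields are created in the first place?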