Document contains at least one immense term in field="errormsg.keyword" (whose UTF8 encoding is longer than the max length 32766

Hi there, I get this error when Logstash tries to write to Elasticsearch. It creates the index, but there is no data available in Elasticsearch.

Document contains at least one immense term in field="errormsg.keyword" (whose UTF8 encoding is longer than the max length 32766

This is my pipeline.conf:

input {
  file {
    path => "c:/logstash.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{TIME:timestamp} %{LOGLEVEL:LEVEL} %{GREEDYDATA:errormsg}" }
  }
}

output {
  if "ERROR" in [LEVEL] {
    elasticsearch {
      hosts => "localhost:9200"
    }
  }
  stdout { codec => rubydebug }
}

Thanks for your help.

Hey,

you are trying to index errormsg as a keyword field, which does not make much sense, because that field is only used for full-text search, not for aggregations or similar. If you are processing long multi-line errors, the size of this field can exceed 32 KB, which results in this error.

You should change the mapping of that field to be text only, without a keyword multi-field in it.
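One way to do this is with an index template that maps errormsg as text only, so no keyword sub-field is created for new indices. A minimal sketch (the template name and index pattern here are just examples; the exact request body varies slightly between Elasticsearch versions):

    PUT _template/errormsg-as-text
    {
      "index_patterns": ["logstash-*"],
      "mappings": {
        "properties": {
          "errormsg": { "type": "text" }
        }
      }
    }

Note that a template only applies to indices created after it is installed, so you would need to delete or reindex the existing index for the new mapping to take effect.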

--Alex

Thank you @spinscale for your answer. Yes, I am processing long multi-line errors in the errormsg field.
So, where can I change the mapping of that field to be text only?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.