Hi there, I get this error when Logstash tries to write to Elasticsearch. It creates the index, but no data shows up in Elasticsearch. The error is:
Document contains at least one immense term in field="errormsg.keyword" (whose UTF8 encoding is longer than the max length 32766
This is my pipeline.conf:
input {
  file {
    path => "c:/logstash.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{TIME:timestamp} %{LOGLEVEL:LEVEL} %{GREEDYDATA:errormsg}" }
  }
}

output {
  if "ERROR" in [LEVEL] {
    elasticsearch {
      hosts => "localhost:9200"
    }
  }
  stdout { codec => rubydebug }
}
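From what I understand, the error means a single keyword term cannot be longer than 32766 bytes once UTF-8 encoded, and my multiline events put the whole stack trace into errormsg. One workaround I was considering (not sure it is the right approach) is truncating errormsg in the filter block before it reaches Elasticsearch, using the truncate filter plugin. The byte limit below is just a guess on my part:

filter {
  truncate {
    # only truncate the field that triggers the immense-term error
    fields => ["errormsg"]
    # keep it well under the 32766-byte keyword limit (value chosen arbitrarily)
    length_bytes => 8192
  }
}

Is that a reasonable fix, or should the mapping for errormsg.keyword be changed instead?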
Thanks for your help.