Timeout Error in logstash

", "_id"=>"AV0xCZcO0YIn9Z-Da2uz", "status"=>400, "error"=>{"type"=>"illegal_argu
ment_exception", "reason"=>"Document contains at least one immense term in field
="prd_ErrorMessage" (whose UTF8 encoding is longer than the max length 32766),
all of which were skipped. Please correct the analyzer to not produce such ter
ms. The prefix of the first immense term is: '[10, 83, 121, 115, 116, 101, 109,
32, 101, 120, 99, 101, 112, 116, 105, 111, 110, 32, 10, 91, 69, 88, 67, 69, 80,
84, 73, 79, 78, 93]...', original message: bytes can be at most 32766 in length
; got 51151", "caused_by"=>{"type"=>"max_bytes_length_exceeded_exception", "reas
on"=>"bytes can be at most 32766 in length; got 51151"}}}}}
[2017-07-11T04:45:01,994][ERROR][logstash.instrument.periodicpoller.jvm] PeriodicPoller: exception {:poller=>#<LogStash::Instrument::PeriodicPoller::JVM:0x5284c1c5 @task=#<Concurrent
, :exception=>#<Concurrent::TimeoutError: Concurrent::TimeoutError>, :executed_at=>2017-07-11 04:45:01 -0500}

[2017-07-11T04:45:02,887][ERROR][logstash.filters.grok ] Error while attempting to check/cancel excessively long grok patterns {:message=>"Mutex relocking by same thread", :

[2017-07-11T04:45:02,906][WARN ][logstash.filters.grok ] Timeout executing grok '(
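For the grok timeout warning above, the grok filter exposes a timeout_millis option that bounds how long a single match may run. A minimal sketch (the match pattern here is illustrative, not the one from my pipeline):

```
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{GREEDYDATA:prd_ErrorMessage}" }
    # Abort matching after 10000 ms instead of the default 30000 ms
    timeout_millis => 10000
  }
}
```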

1. I am loading data from log files, and the prd_ErrorMessage field contains values longer than the maximum term length.

2. I applied "ignore_above": 256, which resolves my timeout error, but I don't want to lose data: with "ignore_above": 256 the whole record is ignored.

3. I can also see CPU usage reaching 100%.
4. While loading a dashboard in Kibana, I am facing the error below.
RemoteTransportException[[Corona][143.22.209.122:9300][indices:data/read/search[phase/query]]]; nested: CircuitBreakingException[[request] Data too large, data for [<reused_arrays>] would be larger than limit of [381891379/364.1mb]];
Caused by: CircuitBreakingException[[request] Data too large, data for [<reused_arrays>] would be larger than limit of [381891379/364.1mb]]
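One alternative I am considering to "ignore_above" is truncating the field before indexing, so the record keeps a usable prefix instead of being dropped. A sketch using the Logstash truncate filter (this assumes the logstash-filter-truncate plugin is installed; 32766 matches Lucene's per-term byte limit from the error above):

```
filter {
  truncate {
    fields => ["prd_ErrorMessage"]
    # Keep at most 32766 bytes so the value stays under Lucene's term limit
    length_bytes => 32766
  }
}
```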

Please suggest a solution for this issue.
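For context on item 4: the [request] breaker limit in that trace (364.1mb) is governed by the indices.breaker.request.limit setting, a percentage of the JVM heap. Raising it is only a sketch of a workaround, not necessarily the right fix, since it trades breaker trips for more heap pressure:

```
# elasticsearch.yml -- request circuit breaker (tune with care)
indices.breaker.request.limit: 70%
```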

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.