Hello All,

I got the error below while parsing logs through Logstash.

[2022-04-14T16:46:53,545][WARN ][logstash.filters.grok ][main][71f87e3c7ed04afa020468f007ac1a64411b0298a6452730ffc54f56f8121e61] Timeout executing grok '[%{GREEDYDATA:Time}] [%{GREEDYDATA:Manage_server}] [%{WORD:Notification}] [%{GREEDYDATA:Logging}] [%{GREEDYDATA}: %{NUMBER:tid}] [%{GREEDYDATA}: %{GREEDYDATA:userId}] [%{GREEDYDATA}: %{GREEDYDATA:ecid}] [%{GREEDYDATA}: %{GREEDYDATA:APP}] [%{GREEDYDATA}: %{GREEDYDATA:partition-name}] [%{GREEDYDATA}: %{GREEDYDATA:tenant-name}] [%{GREEDYDATA}: %{GREEDYDATA:J2EE_APP_Name}] [%{GREEDYDATA}: %{GREEDYDATA:J2EE_MODULE_NAME}] [%{GREEDYDATA}: %{GREEDYDATA:WEBSERVICE_NAME}] [%{GREEDYDATA}: %{GREEDYDATA:WEBSERVICE_PORT_NAME}] [%{GREEDYDATA}: %{GREEDYDATA:oracle_wsm_policy_name}] [%{GREEDYDATA}: %{GREEDYDATA:WSM_RemoteAddress}] [%{GREEDYDATA}: %{GREEDYDATA:WSM_LogType}] [%{GREEDYDATA}: %{GREEDYDATA:WSM_ServiceID}] [%{GREEDYDATA}: %{GREEDYDATA:WSM_OperationName}] [[%{GREEDYDATA:XML}]]' against field 'message' with value 'Value too large to output (9971 bytes)! First 255 chars are: [2022-04-11T12:50:33.081+02:00] [*] [NOTIFICATION] [oracle.wsm.msg.logging] [tid: 146] [userId: ] [ecid: *] [APP: soa-infra] [partition-name: *] [tenant-name: *] [J2EE_APP.name'!

Instead of [[%{GREEDYDATA:XML}]] I have tried [[%{NOTSPACE:XML}]] and also [[%{DATA:XML}]]. Neither has worked for me.

Indeed my XML payload is huge. If I parse it alone, [[%{GREEDYDATA:XML}]] works fine, but with everything together it does not.

Can you please guide on this?
Thanks

Read this.

Grok timeouts typically occur when a pattern does not match, because grok then spends a lot of time back-tracking and trying different possible matches for things like GREEDYDATA.

Consider

[%{GREEDYDATA}: %{NUMBER:tid}] [%{GREEDYDATA}: %{GREEDYDATA:userId}] [%{GREEDYDATA}: %{GREEDYDATA:ecid}]

Can you change that to

[tid: %{NUMBER:tid}] [userId: %{GREEDYDATA:userId}] [ecid: %{GREEDYDATA:ecid}]

Do the same for the other fields that have fixed names. If you can, failure to match will be detected much faster and you will be less likely to hit timeouts.
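As a sketch, the anchored version of your filter could look like this (truncated to three fields; note that literal square brackets need escaping inside a grok pattern, which the original pattern was missing):

```
filter {
  grok {
    # Literal prefixes like "tid: " anchor the match, so a line that does
    # not fit fails fast instead of back-tracking through GREEDYDATA.
    match => {
      "message" => "\[tid: %{NUMBER:tid}\] \[userId: %{DATA:userId}\] \[ecid: %{DATA:ecid}\]"
    }
  }
}
```

Preferring %{DATA} over %{GREEDYDATA} for the values also reduces back-tracking, since DATA is non-greedy.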

Worst case -- use a ruby filter to scan for fields surrounded by square brackets and then parse all the colon delimited entries :smiley:
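A minimal sketch of the kind of scan such a ruby filter could do (the sample line is made up, and in a real filter you would read `event.get("message")` and call `event.set` for each pair):

```ruby
# Extract "[name: value]" pairs from a log line into a hash.
line = '[tid: 146] [userId: jdoe] [ecid: abc123]'

# Each match captures the field name (no brackets or colon) and its value.
fields = line.scan(/\[([^\[\]:]+):\s*([^\[\]]*)\]/).to_h

puts fields.inspect
# => {"tid"=>"146", "userId"=>"jdoe", "ecid"=>"abc123"}
```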


Thanks a lot @Badger. The timeout is no longer occurring and it gives me a result. Let's monitor for a few days and see how it goes!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.