Hi guys!
I'm turning to you because I'm having trouble configuring the multiline pattern that is supposed to aggregate several log lines into the same document.
Here is an excerpt from my log:
2023-08-17 16:13:15.389 |CB RCPD| [INFO] com.payxpert.cbrcpd.cb2a.certified.parameterdownloading.DialogContext:91 - [/] - f5a1f043-b97f-4ef8-977d-3bcec91d2ab2 Outgoing: <?xml version="1.0" encoding="UTF-8" standalone="no"?><isomsg mti="0854">
<field id="11" value="075502"/>
<field id="24" value="860"/>
<field id="39" value="0000"/>
<field id="46">
<subfield id="df54" value="31"/>
</field>
</isomsg>
2023-08-17 16:13:15.391 |CB RCPD| [INFO] com.payxpert.cbrcpd.cbcom.certified.TlcTlpCBCOMChannel:91 - [/] - Outgoing: 0854002001000204000007550208603030303005DF54000131
2023-08-17 16:13:15.392 |CB RCPD| [INFO] com.payxpert.cbrcpd.cb2a.certified.parameterdownloading.Cb2aStateMachineRunner:105 - [/] - State transition: AcceptCloseDialogState -> FinalStateDialogClosedState
I would like the lines above to be sent as 3 documents.
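Concretely, after multiline aggregation I would expect three events, one per timestamp, roughly like this (message values truncated for readability):

doc 1 -> message: "2023-08-17 16:13:15.389 |CB RCPD| [INFO] ... Outgoing: <?xml ...?><isomsg mti=\"0854\">\n<field id=\"11\" value=\"075502\"/>\n...\n</isomsg>"
doc 2 -> message: "2023-08-17 16:13:15.391 |CB RCPD| [INFO] ... Outgoing: 0854002001000204000007550208603030303005DF54000131"
doc 3 -> message: "2023-08-17 16:13:15.392 |CB RCPD| [INFO] ... State transition: AcceptCloseDialogState -> FinalStateDialogClosedState"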
Here is the relevant section of my filebeat.yml:
- type: filestream

  # Unique ID among all inputs, an ID is required.
  id: cb-rcpd-input

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/cb-rcpd/cb-rcpd.log

  parsers:
    - multiline:
        type: pattern
        pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
        negate: false
        match: after
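For what it's worth, the multiline example in the Filebeat docs for logs that start with a timestamp uses negate: true, so that every line NOT matching the date pattern gets appended to the preceding line that did match. I'm not sure whether my negate: false is the culprit, but the docs-style variant would look like this:

  parsers:
    - multiline:
        type: pattern
        # With negate: true + match: after, lines that do NOT start
        # with a date are appended to the previous dated line.
        pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
        negate: true
        match: after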
For information, the logs are sent to Elasticsearch and processed by a grok pattern that only handles lines of the form:
2023-08-17 16:13:15.391 |CB RCPD| [INFO] com.payxpert.cbrcpd.cbcom.certified.TlcTlpCBCOMChannel:91 - [/] - ....
Lines starting with something other than a date are therefore dropped.
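For reference, the grok pattern on the Elasticsearch side expects something along these lines (simplified sketch; the field names here are just illustrative, the real pattern is longer):

  %{TIMESTAMP_ISO8601:timestamp} \|%{DATA:application}\| \[%{LOGLEVEL:level}\] %{JAVACLASS:class}:%{INT:line} - \[/\] - %{GREEDYDATA:msg}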
And this is where my Filebeat pattern causes me a problem:
2023-08-18T15:58:33.811Z WARN [elasticsearch] elasticsearch/client.go:414 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Date(2023, time.August, 18, 15, 58, 32, 785557405, time.Local), Meta:null, Fields:{"agent":{"ephemeral_id":"fc9bb304-ad55-4a62-93e4-063fea737860","hostname":"mpads01","id":"195e33cf-1f2d-4e4b-9f4d-80ba8d1b22b5","name":"mpads01","type":"filebeat","version":"7.17.6"},"ecs":{"version":"1.12.0"},"fields":{"env":"staging"},"host":{"architecture":"x86_64","containerized":false,"hostname":"mpads01","id":"0682dedde5e545d3a97e3b32bcdacd55","ip":["10.10.11.170"],"mac":["00:16:3e:9c:40:ad"],"name":"mpads01","os":{"codename":"bookworm","family":"debian","kernel":"6.1.0-11-amd64","name":"Debian GNU/Linux","platform":"debian","type":"linux","version":"12 (bookworm)"}},"input":{"type":"filestream"},"log":{"file":{"path":"/var/log/cb-rcpd/cb-rcpd.log"},"offset":460216},"message":"\u003c/isomsg\u003e"}, Private:(*input_logfile.updateOp)(0xc0016312c0), TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"illegal_argument_exception","reason":"Provided Grok expressions do not match field value: [</isomsg>]"}, dropping event!
It looks as though the lines never match the multiline pattern, so everything is sent to Elasticsearch line by line, which causes the error. Grok then drops the lines that don't start with a date, which is expected.
Thank you very much for your help, or any leads.
Maxence