I apologise if this is a silly question. I have been reading the documentation and trying to understand how to use ingest pipelines to get my data into fields. I have had some success; however, I have become stuck with some types of custom logs and was wondering what I am missing.
As an example, I am using a dissect pattern in my ingest pipeline such as:
%{@timestamp} [%{module}] %{log.level} %{message->}
This will process log output like the following:
2022-12-16 04:25:24-0800 [-] LOAD - POST api call - https:///utilApp/webapi/secured//update?processcd=wady=23 20
2022-12-16 04:25:24-0800 [HTTP11ClientProtocol (TLSMemoryBIOProtocol),client] LOAD - post call response - b status success 20
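For reference, the processor in my pipeline looks roughly like this (a simplified sketch of my config, not the full pipeline):

```json
{
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{@timestamp} [%{module}] %{log.level} %{message->}"
      }
    }
  ]
}
```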
That works well, as there is no real data I need to separate into explicit fields; I can just pass everything to the message field.
The issue I am faced with is that the same log file contains some interesting data that I would like to put into ECS fields. I am unsure whether, in Elastic, I can treat certain log content differently in order to process it. For example:
2022-12-16 04:25:34-0800 [-] LOAD - Updating load specs with - {'hostName': 'bull.com', 'cpu1': 0.0, 'cpu5': 0.05, 'cpu15': 0.05, 'cpuUtilization': 0.6, 'totalMemory': '9.59', 'totalCpu': '8', 'memoryUtilization': '1.12', 'rollover': 'Y', 'dcLocation': 'V'} 20
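What I imagine is something like a second processor with an `if` condition that only runs on these lines and pulls the curly-brace payload into its own field. This is just a sketch of what I have in mind; the field names `load_specs` and `code` are placeholders I made up, and I am not sure this is the right approach:

```json
{
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{@timestamp} [%{module}] %{log.level} %{message->}"
      }
    },
    {
      "dissect": {
        "if": "ctx.message != null && ctx.message.contains('Updating load specs')",
        "field": "message",
        "pattern": "%{} - Updating load specs with - {%{load_specs}} %{code}"
      }
    }
  ]
}
```

Even then, the payload is a Python-style dict with single quotes rather than valid JSON, so I am not sure how I would break it into individual ECS fields from there.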
I understand I may be able to do this with Logstash, but I would really like to stay with Fleet and the Elastic Agent to process my data.