Hi,
I configured ELK 6.0 with Filebeat to read the log files generated by log4j, and I get the output below:
{
  "_index": "filebeat-6.0.0",
  "_type": "doc",
  "_id": "GawR6YIBaCy-nPSp4SyK",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2022-08-29T10:07:18.372Z",
    "offset": 31456502,
    "@version": "1",
    "beat": {
      "name": "C-CND1312WKH",
      "hostname": "C-CND1312WKH",
      "version": "6.0.0"
    },
    "host": "C-CND1312WKH",
    "prospector": {
      "type": "log"
    },
    "ERROR": "WARN",
    "source": "C:\\phoenix\\logs\\SPSServer.0.log",
    "message": "{\"host\":\"ME1\",\"level\":\"WARN\",\"log\":{\"classname\":\"org.apache.kafka.clients.NetworkClient:776\",\"message\":\"[Producer clientId=producer-23] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.\",\"stacktrace\":\"\",\"threadname\":\"kafka-producer-network-thread | producer-23\"},\"process\":\"processDisconnection\",\"service\":\"\",\"time\":\"2022-07-04T10:57:31.076Z\",\"timezone\":\"UTC\",\"type\":\"log\"}",
    "tags": [
      "beats_input_codec_plain_applied",
      "SPS"
    ]
  },
  "fields": {
    "@timestamp": [
      "2022-08-29T10:07:18.372Z"
    ]
  },
  "highlight": {
    "tags.keyword": [
      "@kibana-highlighted-field@SPS@/kibana-highlighted-field@"
    ],
    "tags": [
      "@kibana-highlighted-field@SPS@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1661767638372
  ]
}
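To illustrate what I mean: the message field is itself a JSON string, so the inner fields are all there and parse cleanly. A minimal sketch (the string is copied from the event above):

```python
import json

# The Filebeat "message" field is a JSON string, copied verbatim from
# the event above, split across literals only for line length.
message = ('{"host":"ME1","level":"WARN","log":{"classname":'
           '"org.apache.kafka.clients.NetworkClient:776","message":'
           '"[Producer clientId=producer-23] Connection to node -1 '
           '(localhost/127.0.0.1:9092) could not be established. '
           'Broker may not be available.","stacktrace":"",'
           '"threadname":"kafka-producer-network-thread | producer-23"},'
           '"process":"processDisconnection","service":"",'
           '"time":"2022-07-04T10:57:31.076Z","timezone":"UTC","type":"log"}')

# Parsing yields a nested structure whose fields I want to query on.
inner = json.loads(message)
print(inner["level"])             # WARN
print(inner["log"]["classname"])  # org.apache.kafka.clients.NetworkClient:776
```

So the data is valid JSON; the problem is only that the whole string lands in one message field in Elasticsearch.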
My requirement is that the inner JSON inside the message field should be available for querying as individual fields. I tried KQL and looked for a Filebeat module, a Filebeat filter, and a Logstash plugin, but didn't find a satisfactory option. Please help. Do I need to write some module/plugin to parse the input as per my requirements?
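For context, the kind of approach I was expecting to work is the standard Logstash json filter, along these lines (the target name parsed_message is just a placeholder I made up):

```
filter {
  json {
    source => "message"
    target => "parsed_message"
  }
}
```

But I'm not sure whether this is the right direction for my Filebeat/log4j setup, or whether something on the Filebeat side would be better.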
Regards,
Lalit