ELK 6.0 with Filebeat


I configured ELK 6.0 with Filebeat to read log files generated by Log4j. I got the output below:

"_index": "filebeat-6.0.0",
"_type": "doc",
"_id": "GawR6YIBaCy-nPSp4SyK",
"_version": 1,
"_score": null,
"_source": {
  "@timestamp": "2022-08-29T10:07:18.372Z",
  "offset": 31456502,
  "@version": "1",
  "beat": {
    "name": "C-CND1312WKH",
    "hostname": "C-CND1312WKH",
    "version": "6.0.0"
  },
  "host": "C-CND1312WKH",
  "prospector": {
    "type": "log"
  },
  "source": "C:\phoenix\logs\SPSServer.0.log",
  "message": "{\"host\":\"ME1\",\"level\":\"WARN\",\"log\":{\"classname\":\"org.apache.kafka.clients.NetworkClient:776\",\"message\":\"[Producer clientId=producer-23] Connection to node -1 (localhost/ could not be established. Broker may not be available.\",\"stacktrace\":\"\",\"threadname\":\"kafka-producer-network-thread | producer-23\"},\"process\":\"processDisconnection\",\"service\":\"\",\"time\":\"2022-07-04T10:57:31.076Z\",\"timezone\":\"UTC\",\"type\":\"log\"}",
  "tags": [ … ]
},
"fields": {
  "@timestamp": [ … ]
},
"highlight": {
  "tags.keyword": [ … ],
  "tags": [ … ]
},
"sort": [ … ]

My requirement is that the inner JSON inside the message field should be available for querying. I tried KQL and looked for a Filebeat module, a Filebeat filter, and a Logstash plugin, but didn't find any satisfactory options. Please help. Do I need to write a module/plugin to parse the input per my requirements?
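For reference, if the events pass through a Logstash pipeline, the built-in json filter can expand the inner JSON into queryable fields — a minimal sketch, where the target field name "log4j" is just an illustrative choice:

```
filter {
  # Parse the JSON string held in the "message" field
  # and nest the resulting fields under "log4j".
  json {
    source => "message"
    target => "log4j"
  }
}
```

After this filter runs, the inner fields are addressable in Kibana as e.g. log4j.level or log4j.log.classname. Omitting target would instead merge the parsed fields into the event root.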


Hi, the 6.x releases are past EOL; we advise that you update to a more current release.

I believe the Decode JSON fields | Filebeat Reference [8.4] | Elastic processor may be what you're looking for.
The 6.0 docs for it are Decode JSON fields | Filebeat Reference [6.0] | Elastic.
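That processor can be applied directly in filebeat.yml, so no Logstash change is needed — a sketch for a 6.0-style config, reusing the log path from your output; the target name "log4j" and max_depth value are illustrative assumptions:

```yaml
filebeat.prospectors:
- type: log
  paths:
    - C:\phoenix\logs\SPSServer.0.log

processors:
- decode_json_fields:
    # Parse the JSON string stored in the "message" field.
    fields: ["message"]
    # Nest the decoded fields under "log4j" instead of the event root.
    target: "log4j"
    process_array: false
    max_depth: 2
```

With this in place, each event should carry fields like log4j.level and log4j.log.message that you can query in Kibana.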

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.