Logstash filters for a log file that contains both raw data and JSON data

Hi Elastic team, I'm new to ELK. I'm trying to work out filters for the log file below, but I haven't been able to find the proper Logstash filter for this data:

2023-01-19 15:38:31 INFO VCIPDownstreamController:138 - {"timestamp":"2023-19-01T15:38:31","module":"VCIPDownstreamInitRequest","requestURI":"http://sample.url.com/sam_bnk_vcip_ws/v1/bbl/saveCustomerVCIPDetails","appRefNum":"Test29Dec0006","requestDateTime":"2023-19-01T15:38:31","responseDataTime":"","responseTime":"","requestData":"0dJiseXXhoHZJSoEeLqGkQ\u003d\u003d","responseData":"","responseCode":"","accessLog":"{"remoteHost":"192.11.13.133","remoteAddr":"192.11.13.133","localaadr":"192.11.8.118","x-forwarded-for":["152.170.4.37"],"x-forwarded-proto":["https"],"x-forwarded-port":["443"],"host":["sample.url.com"],"x-amzn-trace-id":["Root\u003d1-63c9169e-01c9501a04f1239777d95ae7"],"content-length":["121799"],"authorization":["Basic VmNpcHVzZXI6VmNpcEtMQHA123\u003d\u003d"],"content-type":["application\/json"],"user-agent":["PostmanRuntime\/7.30.0"],"accept":["\/"],"postman-token":["7554nfbf-1d8f-4793-8258-d27c2ccf4cae"],"accept-encoding":["gzip, deflate, br"],"cookie":["AWSALB\u003d1Apfla0u4SSa7aQ3vAIGAv6wKbc5KH2eUhQ1lXCVr8SIzswx2vAWIK3e1gzob2FNspNYJA+aOjtwnXQoTFbQAgE0duYlEyGHIYk\/b\/byEdTGu0yeuCZ5roEM9706; AWSALBCORS\u003d1Apfla0u4SSa7aQ3vAIGAv6wKbc5KH2eUhQ1lXCVr8SIzswx2vAWIK3e1gzob2FNspNYJA+aOjtwnXQoTFbQAgE0duYlEyGHIYk\/b\/byEdTGu0yeuCZ5roEM9706"]}"}

I have used below grok pattern

%{TIMESTAMP_ISO8601:date} %{LOGLEVEL:log-level} %{DATA:data}:%{INT:int} - %{GREEDYDATA:message}

but my team also wants the GREEDYDATA message split into separate fields in Kibana, because it contains a lot of JSON key-value pairs.

Kindly help me with this as soon as possible. Thank you in advance.

Have you tried using a json filter?
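
For example, a minimal sketch (the source field name is illustrative; use whichever field your grok pattern captures the JSON payload into):

```
filter {
  json {
    source => "json_data"     # field holding the raw JSON text from grok
    target => "parsed_json"   # parsed keys land under parsed_json.*
  }
}
```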

Yes, I tried the filter below, but the full data is not showing in Kibana:

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} %{DATA:data}:%{INT:int} - %{GREEDYDATA:json_data}" }
  }
}
filter {
  json {
    source => "json_data"
    target => "parsed_json"
  }
}

Actually, my log file has another JSON-style section of key-value pairs; in the log entry I posted above, you can see it starts at the "accessLog" key.

Please look into that and help me out.


I'm struggling with that "accessLog" key-value data; I'm not able to find the right filter to show all the fields and all the data in Kibana.

Please help me out with this.
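
One possible approach, sketched as an assumption rather than a tested config: the "accessLog" value is an object wrapped in quotes whose inner quotes are not escaped, so the payload is not valid JSON as-is. If the `"accessLog":"{...}"` shape is stable across log lines, you could strip those wrapping quotes with mutate/gsub before the json filter runs, then parse everything in one pass:

```
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} %{DATA:data}:%{INT:int} - %{GREEDYDATA:json_data}" }
  }
  # accessLog's object is quoted but its inner quotes are not escaped,
  # which breaks JSON parsing. Remove the wrapping quotes first
  # (illustrative regexes; adjust if your log format differs).
  mutate {
    gsub => [
      "json_data", '"accessLog":"\{', '"accessLog":{',
      "json_data", '\}"\}\s*$', '}}'
    ]
  }
  json {
    source => "json_data"
    target => "parsed_json"   # accessLog becomes a nested object here
  }
}
```

If that assumption holds, the accessLog keys (remoteHost, x-forwarded-for, and so on) should then appear as nested fields under parsed_json.accessLog in Kibana.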

Please edit your post, select the configuration and click on </> in the toolbar above the edit pane, then do the same for the log entry. That will change the format from

filter {
grok { match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} %{DATA:data}:%{INT:int} - %{GREEDYDATA:json_data}" }
}
}

to

filter {
    grok { match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} %{DATA:data}:%{INT:int} - %{GREEDYDATA:json_data}" }
    }
}

and prevent the forum software from consuming parts of the message as formatting.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.