Hello Team,
I have one use case related to logstash filters
I need to parse nested JSON data, keeping only specific fields and dropping the rest.
Data format:
{
  "_index": "index-name",
  "_source": {
    "level": "INFO",
    "params": {
      "language": "id",
      "ver": "1"
    },
    "Id": "some-id",
    "logger": "RequestListener",
    "httpRequest": {
      "requestMethod": "GET",
      "userAgent": "external-agent"
    },
    "message": "xyz-message",
    "@timestamp": "2018-08-21T19:18:30.143Z",
    "headers": {
      "x-forwarded-port": "port"
    }
  }
}
Say, in the above logs, the fields below should be treated as whitelisted and the rest should be dropped:
level, httpRequest.requestMethod, message, params.language
I tried using the kv filter with include_keys but had no luck.
Also, the prune filter does not recognize "[httpRequest][requestMethod]" as a field.
Please let me know what I could be missing here.
Thanks
Move the nested fields ([httpRequest][requestMethod] and [params][language]) into the top level, then use the prune filter.
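A minimal filter sketch of that approach, using the field names from the sample document above (the top-level target names `requestMethod` and `language` are my own choice, and depending on your Logstash version you may also need to whitelist `@timestamp` and `@version`):

```
filter {
  mutate {
    # Copy the nested values up to top-level fields
    rename => {
      "[httpRequest][requestMethod]" => "requestMethod"
      "[params][language]"           => "language"
    }
  }
  prune {
    # whitelist_names takes regexes; anchor them so e.g. "level"
    # does not also match a field named "loglevel"
    whitelist_names => [ "^level$", "^message$", "^requestMethod$", "^language$" ]
  }
}
```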
Thanks a lot @magnusbaeck.
I implemented it with the mutate filter, then applied prune whitelisting on those fields.
If you get time, please check the query below.
Hello Team,
Getting the below issue while using the cloudwatch plugin with Logstash.
On randomly checking CloudWatch against the Kibana index, some of the logs are missing.
I am using the logstash-input-cloudwatch plugin to pull logs from CloudWatch.
Say, for the index mydata I am not able to find the value xyz, but the same value is present in CloudWatch.
The problem is also that there is no parse-failed data.
Data format: json
Any insight on this would be a great help.
Thanks
I am seeing a drop in log ingestion from the cloudwatch plugin: a few of the logs are missing in ELK but available at the source (CloudWatch).
Logs are ingested at an average rate of 6 million/hour.
I couldn't figure out a way to get to the root cause of this.
Thanks.
system closed this topic on September 20, 2018, 6:27pm.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.