How to parse JSON fields from a log that has multiple events but no delimiter?
The logs are CloudWatch logs streamed to Kinesis Firehose, stored in S3, and then streamed to Logstash.
Are you saying you have multiple JSON objects on one event? Like this:
{ "foo": 1 } { "bar": 2 }
If so, neither a json filter nor a json codec will parse it correctly. Although interestingly (at least to me) one only parses foo and the other only parses bar!
I think you would have to look at writing a custom parser in ruby.
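A custom parser along these lines could walk the string and split it at top-level brace boundaries. This is only a sketch (the helper name `split_concatenated_json` is made up for illustration); it tracks brace depth and JSON string quoting so that braces inside values don't cause false splits:

```ruby
require 'json'

# Hypothetical helper: split a string of concatenated JSON objects
# (with no delimiter between them) into an array of parsed hashes.
# It tracks brace depth, treating depth 0 -> 1 as the start of an
# object and 1 -> 0 as its end, and skips over quoted strings so
# braces inside values are ignored.
def split_concatenated_json(raw)
  objects = []
  depth = 0
  start = nil
  in_string = false
  escaped = false
  raw.each_char.with_index do |ch, i|
    if in_string
      if escaped
        escaped = false
      elsif ch == '\\'
        escaped = true
      elsif ch == '"'
        in_string = false
      end
      next
    end
    case ch
    when '"'
      in_string = true
    when '{'
      start = i if depth.zero?
      depth += 1
    when '}'
      depth -= 1
      objects << JSON.parse(raw[start..i]) if depth.zero?
    end
  end
  objects
end
```

This also tolerates whitespace between objects, so both `{"foo":1}{"bar":2}` and `{"foo":1} {"bar":2}` come back as two separate hashes.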
Yes! logs like this
{"key1":"value","key2":"value","key3":["key4":'value","key4":"value"]}{"key1":"value","key2":"value","key3":["key4":'value","key4":"value"]} {"key1":"value","key2":"value","key3":["key4":'value","key4":"value"]}{"key1":"value","key2":"value","key3":["key4":'value","key4":"value"]}
Using Grok, can we define the entire pattern and specify how many times it occurs?
Is that one event or four? If it is four, then a json filter or codec will work. If it is one, then I do not see an alternative to a custom parser.
That does not look like valid JSON so a custom parser is probably required.
Single event. Agreed! Will work on a custom parser.
I tried to add a newline between the JSON events like below, and in the output section set codec => json_lines, but the JSON messages are not getting split into individual events. Any ideas on this?
filter {
  mutate {
    gsub => ["message", "}{", "}\n{"]
  }
}
This worked for me
filter {
  mutate {
    gsub => ["message", "}{", "}FORSPLITLOGSTASH{"]
  }
  split {
    field => "message"
    terminator => "FORSPLITLOGSTASH"
  }
  json {
    source => "message"
  }
}
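For anyone following along, here is a minimal Ruby sketch of what that gsub/split/json chain does to a sample message (the sample keys and values here are made up; the marker string is the one from the config):

```ruby
require 'json'

# Marker inserted between objects, then used as the split terminator,
# mirroring the mutate gsub + split filters in the config above.
MARKER = 'FORSPLITLOGSTASH'

message = '{"key1":"a"}{"key1":"b"}'

# 1) gsub: turn every "}{" boundary into "}MARKER{"
# 2) split: break the string into one piece per JSON object
# 3) json: parse each piece into a hash (one event per object)
events = message.gsub('}{', "}#{MARKER}{").split(MARKER)
parsed = events.map { |e| JSON.parse(e) }
```

One caveat with this approach: the gsub is textual, so it would also split on a literal `}{` that happens to appear inside a string value. That is unlikely in most logs, but the brace-depth parser sketched earlier in the thread avoids the problem entirely.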
Thank you all !
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.