Parse Cloudflare logs in JSON format with ELK

Hi,
I'm trying to create a Logstash filter to parse Cloudflare JSON logs like this:

{"ClientIP":"x.x.x.x","ClientRequestHost":"www.test.org","ClientRequestMethod":"POST","ClientRequestProtocol":"HTTP/2","ClientRequestReferer":"","ClientRequestURI":"/index.html","ClientRequestUserAgent":"Go-http-client/2.0","ClientSrcPort":59876,"EdgeEndTimestamp":1533254858064000000,"EdgeServerIP":"","EdgeStartTimestamp":1533254857428999936,"OriginIP":"y.y.y.y","OriginResponseStatus":204,"WAFAction":"unknown","WAFFlags":"0","WAFMatchedVar":"","WAFProfile":"unknown","WAFRuleID":"","WAFRuleMessage":""}

using Filebeat as the input:

input_type: log
json.keys_under_root: true
json.overwrite_keys: false
json.add_error_key: true
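In case it helps, those json.* options belong under a prospector entry in filebeat.yml; a minimal sketch (the log path is an assumption, not from my actual setup):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/cloudflare/*.log   # assumed path to the downloaded Cloudflare logs
    json.keys_under_root: true      # lift the JSON keys to the top level of the event
    json.overwrite_keys: false
    json.add_error_key: true        # add an error field if a line fails to parse
```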

and a Logstash filter like this:

filter {
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "ISO8601"]
    target => "@timestamp"
    remove_field => "timestamp"
  }
  mutate {
    gsub => [
      "message", '"{', "'{",
      "message", '}"', "}'"
    ]
  }
  kv {
    source => "message"
    remove_field => ["message"]
    field_split => ","
    value_split => ":"
    trim_key => " "
    trim_value => " "
  }
}

But Kibana doesn't show the fields split correctly, in particular the first and the last one.
Could you help me?
Thanks

Don't parse JSON with the mutate and kv filters. Use a json filter or json codec in Logstash, or enable Filebeat's JSON parser (which it looks like you've already done).
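For example, if you'd rather keep the parsing in Logstash, a json filter on the message field would look roughly like this. This is only a sketch: the ruby block is my assumption about handling the EdgeStartTimestamp field, which Cloudflare emits in nanoseconds, so it can't be fed to the date filter directly.

```conf
filter {
  json {
    source => "message"
    remove_field => ["message"]
  }
  # EdgeStartTimestamp is in nanoseconds since the epoch;
  # convert to seconds before using it as the event timestamp.
  ruby {
    code => "event.set('@timestamp', LogStash::Timestamp.at(event.get('EdgeStartTimestamp') / 1_000_000_000.0))"
  }
}
```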

So you're suggesting I remove the Logstash filter above completely, and everything will work fine?

Great!
I just added the json codec to 02-beats.conf:
codec => "json"
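For anyone finding this later, the beats input with that codec looks roughly like this (the port is an assumption, 5044 being the usual Beats default):

```conf
input {
  beats {
    port => 5044
    codec => "json"   # decode each incoming line as a JSON event
  }
}
```

Note that if Filebeat's JSON parsing (json.keys_under_root) is also enabled, the events arrive already decoded, so only one of the two is actually needed.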

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.