Hi,
I'm trying to create a Logstash filter to parse Cloudflare JSON logs like this:
{"ClientIP":"x.x.x.x","ClientRequestHost":"www.test.org","ClientRequestMethod":"POST","ClientRequestProtocol":"HTTP/2","ClientRequestReferer":"","ClientRequestURI":"/index.html","ClientRequestUserAgent":"Go-http-client/2.0","ClientSrcPort":59876,"EdgeEndTimestamp":1533254858064000000,"EdgeServerIP":"","EdgeStartTimestamp":1533254857428999936,"OriginIP":"y.y.y.y","OriginResponseStatus":204,"WAFAction":"unknown","WAFFlags":"0","WAFMatchedVar":"","WAFProfile":"unknown","WAFRuleID":"","WAFRuleMessage":""}
using Filebeat as the input:
input_type: log
json.keys_under_root: true
json.overwrite_keys: false
json.add_error_key: true
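For reference, the relevant prospector section of my filebeat.yml looks roughly like this (the log path below is just a placeholder, not the real Cloudflare log location):

filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/cloudflare/*.log
    json.keys_under_root: true
    json.overwrite_keys: false
    json.add_error_key: true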
and a Logstash filter like this:
filter {
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "ISO8601"]
    target => "@timestamp"
    remove_field => ["timestamp"]
  }
  mutate {
    gsub => [
      "message", '"{', "'{",
      "message", '}"', "}'"
    ]
  }
  kv {
    source => "message"
    remove_field => ["message"]
    field_split => ","
    value_split => ":"
    trim_key => " "
    trim_value => " "
  }
}
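I was also wondering whether letting Logstash decode the JSON with a json filter would be a better fit than the gsub/kv approach. A rough sketch of what I had in mind (not tested, and it assumes I drop the json.* options on the Filebeat side so the raw JSON line arrives in the message field):

filter {
  json {
    # parse the raw Cloudflare JSON line into top-level fields
    source => "message"
    remove_field => ["message"]
  }
  ruby {
    # EdgeStartTimestamp looks like epoch nanoseconds; convert it to seconds for @timestamp
    code => "event.set('@timestamp', LogStash::Timestamp.at(event.get('EdgeStartTimestamp') / 1_000_000_000.0)) if event.get('EdgeStartTimestamp')"
  }
}

I'm not sure if that is the right direction either, so any advice on either approach is welcome.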
But Kibana doesn't show the fields split correctly, in particular the first and the last one.
Could you help me?
Thanks