How to make a filter for a JSON format log file


(Young Heon Kim) #1

Hi everyone.

I would like to parse a JSON format log file produced by the Percona audit plugin.
The format of the log file is as follows:


Nov 25 23:25:18 serverHostName percona-audit: {"audit_record":{"name":"Query", "record":"01T00:00:00", "timestamp":"2015-11-26T07:25:18 UTC", "command_class":"select", "connection_id":"3", "status":0,"sqltext":"select version()","user":"root[root] @ localhost []","host":"localhost","os_user":"","ip":""}}

How can I write the filter configuration?


(Magnus Bäck) #2

Use a grok filter to extract the timestamp, hostname, whatever percona-audit is, and finally the JSON payload into separate fields. The log format looks very similar to syslog so it should be easy to find something that's very close to what you need (and http://grokconstructor.appspot.com/ can also be helpful). Then use a json filter to parse the field with the JSON payload.
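To make that concrete, a minimal filter along those lines could look like this. This is a sketch, not a tested config; the field names sys_timestamp, host_name, and json_payload are illustrative choices, not anything required by Logstash:

```
filter {
  grok {
    # Extract the syslog-style prefix (timestamp, hostname, program tag)
    # and capture everything after "percona-audit: " as the JSON payload.
    match => { "message" => "%{SYSLOGTIMESTAMP:sys_timestamp} %{HOSTNAME:host_name} percona-audit: %{GREEDYDATA:json_payload}" }
  }
  json {
    # Parse the captured JSON string into structured event fields.
    source => "json_payload"
  }
}
```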


(Young Heon Kim) #3

Thank you for your answer.
However, this is my first time using Logstash, so I don't have much experience with it.
I tried the add_field option of the json filter, but I couldn't get it to parse.

Could you show me a sample config file?


(Young Heon Kim) #4

Hi @magnusbaeck
I was able to parse this log.

The site you suggested was very helpful.
Thanks a lot.

I'm sharing my config file for the Percona audit log.


input { stdin { } }
output { stdout { codec => "rubydebug" } }
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:sys_timestamp}%{SPACE}%{HOSTNAME:host_name}%{SPACE} percona-audit: %{GREEDYDATA:json_data}" }
  }
  json {
    source => "json_data"
  }
}
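Building on the config above, the audit record's own timestamp could also be mapped into @timestamp with a date filter. This is an untested sketch: it assumes the "timestamp" format shown in the sample ("2015-11-26T07:25:18 UTC"), and I have not verified the ZZZ pattern against this exact input:

```
filter {
  date {
    # Parse the nested audit_record.timestamp field into @timestamp,
    # instead of keeping the event's ingestion time.
    match => [ "[audit_record][timestamp]", "yyyy-MM-dd'T'HH:mm:ss ZZZ" ]
  }
}
```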


Result of parsing:


echo 'Nov 25 23:25:18 hostA percona-audit: {"audit_record":{"name":"Query","record":"35_1970-01-01T00:00:00","timestamp":"2015-11-26T07:25:18 UTC","command_class":"select","connection_id":"3","status":0,"sqltext":"select version()","user":"root[root] @ localhost []","host":"localhost","os_user":"","ip":""}}' | /opt/logstash/bin/logstash -f test.config

Logstash startup completed
{
          "message" => "Nov 25 23:25:18 hostA percona-audit: {\"audit_record\":{\"name\":\"Query\",\"record\":\"35_1970-01-01T00:00:00\",\"timestamp\":\"2015-11-26T07:25:18 UTC\",\"command_class\":\"select\",\"connection_id\":\"3\",\"status\":0,\"sqltext\":\"select version()\",\"user\":\"root[root] @ localhost []\",\"host\":\"localhost\",\"os_user\":\"\",\"ip\":\"\"}}",
         "@version" => "1",
       "@timestamp" => "2015-11-27T04:04:16.994Z",
             "host" => "hostB",
    "sys_timestamp" => "Nov 25 23:25:18",
        "host_name" => "hostA",
        "json_data" => "{\"audit_record\":{\"name\":\"Query\",\"record\":\"35_1970-01-01T00:00:00\",\"timestamp\":\"2015-11-26T07:25:18 UTC\",\"command_class\":\"select\",\"connection_id\":\"3\",\"status\":0,\"sqltext\":\"select version()\",\"user\":\"root[root] @ localhost []\",\"host\":\"localhost\",\"os_user\":\"\",\"ip\":\"\"}}",
     "audit_record" => {
                 "name" => "Query",
               "record" => "35_1970-01-01T00:00:00",
            "timestamp" => "2015-11-26T07:25:18 UTC",
        "command_class" => "select",
        "connection_id" => "3",
               "status" => 0,
              "sqltext" => "select version()",
                 "user" => "root[root] @ localhost []",
                 "host" => "localhost",
              "os_user" => "",
                   "ip" => ""
    }
}
Logstash shutdown completed
