Hi all,
I am new to the ELK stack and am currently working with Logstash -> Elasticsearch -> Kibana.
I need to parse db2diag.log, and I am using the grok filter for this. The problem is that everything works fine in the grok debugger website, but when I put the same filter into Logstash I get a _grokparsefailure. My log file looks like this:
PID : 5505958 TID : 258 PROC : db2ckpwd 0
INSTANCE: db2inst1 NODE : 000
HOSTNAME: PROD_FEPAPP_DB
EDUID : 258 EDUNAME: db2wdog 0 [db2inst1]
FUNCTION: DB2 UDB, oper system services, sqloSpawnAndWaitForPasswordCheckExe, probe:130
MESSAGE : ZRC=0x800F006A=-2146500502=SQLO_BAD_USER "Bad User"
DIA8117C Error with userid "".
And my grok filter:
(?m)%{GREEDYDATA:DATE} .*?:%{LOGLEVEL:loglevel}.*?:%{GREEDYDATA:pid}.*?:%{GREEDYDATA:tid}.*?:%{GREEDYDATA:proc}\n.*?:%{GREEDYDATA:instance}.*?:%{GREEDYDATA:node}\n.*?:%{GREEDYDATA:hostname}\n.*?:%{GREEDYDATA:eduid}.*?:%{GREEDYDATA:eduname}\n.*?:%{GREEDYDATA:function}.*?:%{GREEDYDATA:probe}\n.*?:%{GREEDYDATA:msg}
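For context, this is roughly how the filter sits in my pipeline config. The file path and the multiline pattern are placeholders for my setup; the grok pattern is the one shown above (abbreviated here):

```conf
input {
  file {
    path => "/var/log/db2/db2diag.log"   # placeholder path, adjust to your install
    start_position => "beginning"
    codec => multiline {
      # Assumption: each db2diag.log record starts with a timestamp line,
      # so every line NOT starting with a digit is joined to the previous event.
      pattern => "^\d{4}-\d{2}-\d{2}"
      negate  => true
      what    => "previous"
    }
  }
}

filter {
  grok {
    # full pattern as above, starting with (?m)%{GREEDYDATA:DATE} ...
    match => { "message" => "(?m)%{GREEDYDATA:DATE} .*?:%{LOGLEVEL:loglevel}.*?:%{GREEDYDATA:pid}.*?:%{GREEDYDATA:tid}.*?:%{GREEDYDATA:proc}" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }   # placeholder host
}
```

Without a multiline codec (or the multiline filter) each physical line arrives as its own event, so a pattern spanning several lines can never match, which is one difference I suspect between the debugger and my Logstash run.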
Is this correct? Or do I have to use a filter other than grok for this kind of multiline parsing?
Regards,
Atta