Hello Logstash community
I set up the ELK stack (filebeat > logstash > elasticsearch > kibana) on a Windows 2012 server last week and have been trying to configure Logstash to parse my logs correctly. It's mostly working, but I seem to be having trouble with the timestamp conversion.
Overview
An example of the log messages I'm trying to parse is:
2016-07-04 12:57:03,223 CAT INFO log.kernel 1019:0005 [ebx-scheduler-worker-1] Processing Product Pid: 655
My logstash filter is:
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} CAT %{LOGLEVEL:loglevel}\s*log.%{WORD:logtype} %{NUMBER}:%{NUMBER} %{SYSLOG5424SD:thread} %{GREEDYDATA:log_message}" }
  }
  date {
    match => [ "log_timestamp", "YYYY-MM-DD HH:mm:ss,SSS" ]
  }
}
and the output for this log entry is:
{
  "message": "2016-07-04 12:57:03,223 CAT INFO log.kernel 1019:0005 [ebx-scheduler-worker-1] Processing Product Pid: 655",
  "@version": "1",
  "@timestamp": "2016-01-04T10:57:03.223Z",
  "count": 1,
  "fields": null,
  "input_type": "log",
  "beat": {
    "hostname": "ZAAFHVMEBX01",
    "name": "ZAAFHVMEBX01"
  },
  "source": "C:\EBX5\EBXHome\ebxLog\kernel.log",
  "offset": 5446815,
  "type": "log",
  "host": "ZAAFHVMEBX01",
  "tags": ["beats_input_codec_plain_applied"],
  "log_timestamp": "2016-07-04 12:57:03,223",
  "loglevel": "INFO",
  "logtype": "kernel",
  "thread": "[ebx-scheduler-worker-1]",
  "log_message": "Processing Product Pid: 655"
}
Problem
The field log_timestamp is being correctly identified:
"log_timestamp": "2016-07-04 12:57:03,223"
but there appears to be a problem with its conversion to @timestamp:
"@timestamp": "2016-01-04T10:57:03.223Z"
The date in the log_timestamp field is 4 July 2016, while the date in the @timestamp field is 4 January 2016. (The two-hour shift from 12:57 to 10:57Z is presumably just the conversion to UTC, since the server is on UTC+2, so the date is the only part that looks wrong.)
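My guess, which I haven't been able to confirm, is that the trailing 04 is being parsed as a day-of-year rather than a day-of-month. The sketch below simulates that hypothesis with Python's strptime (not the actual Logstash/Joda-Time parser), using %j (day of year) versus %d (day of month); the day-of-year reading reproduces exactly the January date I'm seeing:

```python
from datetime import datetime

log_ts = "2016-07-04 12:57:03"

# Hypothetical simulation: if the final field is read as day-of-year (%j),
# the "04" means the 4th day of 2016, i.e. 4 January, and the month is ignored.
as_day_of_year = datetime.strptime(log_ts, "%Y-%m-%j %H:%M:%S")

# Reading it as day-of-month (%d) gives the date I actually expect.
as_day_of_month = datetime.strptime(log_ts, "%Y-%m-%d %H:%M:%S")

print(as_day_of_year)   # 2016-01-04 12:57:03
print(as_day_of_month)  # 2016-07-04 12:57:03
```

If that's what's happening inside the date filter, I'm not sure which part of my pattern is causing it.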
Please can someone assist me with getting the correct date conversion?
Kind regards
Craig