Data not indexed as per custom template in Kibana from Logstash

Hi,
I am having an issue displaying content in Kibana as per my template, and I can see a "_grokparsefailure" tag in the Logstash output. The template itself installs successfully in Elasticsearch.
My configuration is:

input {
  file {
    path => "C:/Anuj/ElasticSearch/DMS_GTX-Process_Archive.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => multiline {
      pattern => '^[0-9]{4}'
      negate => true
      what => "previous"
    }
  }
}
filter {
  mutate {
    gsub => [ "message", "GMT", "" ]
  }
  mutate {
    gsub => [ "message", "-0800", "" ]
  }
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP :timestamp} %{NUMBER:num} %{USERNAME:Application} User [%{USERNAME :BWuser}] - %{USERNAME :job} [(?[a-zA-Z0-9./\s]+)]:%{GREEDYDATA:Log}" }
    overwrite => ["message"]
  }
  date {
    match => [ "timestamp" , "yyyy MM dd HH:mm:ss,SSS" ]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    template_overwrite => true
    template_name => "bw-test"
    manage_template => true
    template => "C:/Anuj/ElasticSearch/template/bw-gtx.json"
    hosts => ["localhost:9200"]
    index => "bw-test"
  }
}

The template is:
{
  "template": "bw-test*",
  "order": 1,
  "mappings": {
    "doc": {
      "dynamic": false,
      "properties": {
        "@timestamp": { "type": "date" },
        "loggedTime": { "type": "keyword" },
        "Application": { "type": "keyword" },
        "job": { "type": "keyword" },
        "BwPath": { "type": "keyword" },
        "Log": { "type": "keyword" }
      }
    }
  }
}
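(Aside: one quick sanity check is that the template file parses as valid JSON and that its index pattern covers the index the output writes to. A minimal standalone sketch of that check, with the template body from above inlined rather than read from the C:/ path:)

```python
import json

# The template body from above, inlined so the check is self-contained.
template = """
{
  "template": "bw-test*",
  "order": 1,
  "mappings": {
    "doc": {
      "dynamic": false,
      "properties": {
        "@timestamp":  { "type": "date" },
        "loggedTime":  { "type": "keyword" },
        "Application": { "type": "keyword" },
        "job":         { "type": "keyword" },
        "BwPath":      { "type": "keyword" },
        "Log":         { "type": "keyword" }
      }
    }
  }
}
"""

body = json.loads(template)  # raises ValueError if the JSON is malformed
# The index pattern must cover the index the output writes to ("bw-test").
assert body["template"] == "bw-test*"
print(sorted(body["mappings"]["doc"]["properties"]))
```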
A sample input line is:
2019 Jan 09 16:16:23:897 GMT -0800 BW.DKS_AAd-Process_Archive User [BW-User] - Job-18454 [Processes/CDM Pub.process/Group/LogSN]: CIM CM publish -
subj3 Data=<?xml version="1.0" encoding="UTF-8"?>

The output I can see in Kibana is:
{
"@version" => "1",
"host" => "PC7745L",
"message" => "2019 Jan 09 16:16:23:898 BW.ghh_GgjhX-Process_Archive User
[BW-User] - Job-1454 [Processes/CkkM Pub.process/Log-Complete]: ChhkmM Outage pub
lish process complete \r\n\r",
"@timestamp" => 2019-03-05T21:41:15.788Z,
"path" => "C:/Anuj/ElasticSearch/DklS_GkhgX-Process_Archive.log",
"tags" => [
[0] "multiline",
[1] "_grokparsefailure"
]
}

Your timestamp does not match SYSLOGTIMESTAMP, and I do not see a NUMBER field following it. The square brackets need to be escaped and the optional unnamed capture group does not work. Also, your date filter needs MMM instead of MM, and a colon instead of a comma. Try

grok {
    match => { "message" =>"^%{YEAR:[@metadata][year]} %{SYSLOGTIMESTAMP:[@metadata][ts]} %{USERNAME:Application} User \[%{USERNAME:BWuser}\] - %{USERNAME:job} \[[^\]]*\]:%{GREEDYDATA:Log}" }
    add_field => { "timestamp" => "%{[@metadata][year]} %{[@metadata][ts]}" }
}
date { match => [ "timestamp" , "yyyy MMM dd HH:mm:ss:SSS" ] }
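To sanity-check the suggested pattern outside of Logstash, here is a rough Python translation of it, run against the sample line above after the two gsub substitutions. The field names and the \s+ separators (which absorb the extra spaces the gsubs leave behind) are my own assumptions, not exact grok semantics, and the strptime call approximates the Joda "yyyy MMM dd HH:mm:ss:SSS" format, since Python has no native millisecond token:

```python
import re
from datetime import datetime

raw = ("2019 Jan 09 16:16:23:897 GMT -0800 BW.DKS_AAd-Process_Archive User "
       "[BW-User] - Job-18454 [Processes/CDM Pub.process/Group/LogSN]: CIM CM publish -")

# Mimic the two mutate/gsub filters: strip the literal zone markers.
msg = raw.replace("GMT", "").replace("-0800", "")

# Approximate Python translation of the suggested grok pattern; \s+ between
# fields absorbs the extra spaces the gsubs leave behind (an assumption here).
pattern = re.compile(
    r"^(?P<year>\d{4})\s+"
    r"(?P<ts>[A-Z][a-z]{2} +\d{2} \d{2}:\d{2}:\d{2}:\d{3})\s+"
    r"(?P<application>[\w.-]+) User \[(?P<bwuser>[\w.-]+)\] - "
    r"(?P<job>[\w-]+) \[[^\]]*\]:(?P<log>.*)"
)
m = pattern.match(msg)
assert m is not None

# Rebuild the timestamp the way the add_field does: year + syslog timestamp.
timestamp = f"{m.group('year')} {m.group('ts')}"

# Python's strptime has no token for the :SSS milliseconds, so split the
# fraction off first and re-attach it as microseconds.
stamp, millis = timestamp.rsplit(":", 1)
parsed = datetime.strptime(stamp, "%Y %b %d %H:%M:%S").replace(microsecond=int(millis) * 1000)
print(parsed.isoformat())  # 2019-01-09T16:16:23.897000
```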

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.