@timestamp is not getting updated through date filter. getting _dateparsefailure

(Fredrick Yessaian) #1

I have the following Logstash configuration for WebSphere App Server SystemErr logs.
As per the grok pattern, I do see the logs getting parsed, but overriding @timestamp is not working; I'm getting _dateparsefailure.
Here is the configuration:

input {
	file {
		path => ["/Path/System*.log"]
		start_position => "beginning"
		sincedb_path => "/Path/sincedbfile_WAS.txt"
		codec => multiline {
			pattern => "^\["
			negate => true
			what => "previous"
		}
		exclude => ["*.gz","native_std*.log"]
		type => "WAS_logs"
		tags => "WAS_logs"
	}
}

filter {
	grok {
		patterns_dir => ["./appl/grokpattern"]
		match => ["message", "\[%{TimeDate:WAS_TimeStamp}\] %{BASE16NUM:WAS_ThreadId} %{WORD:WAS_EventType}%{SPACE}%{WORD:WAS_LogLevel}%{SPACE}%{GREEDYDATA:WAS_LogMessage}"]
		overwrite => [ "message" ]
	}
	date {
		match => ["WAS_TimeStamp", "M/dd/YY HH:mm:ss:SSS ZZZ", "MM/d/YY HH:mm:ss:SSS ZZZ", "M/d/YY HH:mm:ss:SSS ZZZ", "MM/dd/YY H:mm:ss:SSS ZZZ", "M/d/YY H:mm:ss:SSS ZZZ", "MM/d/YY H:mm:ss:SSS ZZZ", "M/dd/YY H:mm:ss:SSS ZZZ"]
		target => "@timestamp"
	}
}

output {
	elasticsearch {
		hosts => ["HOST1:9200","HOST2:9200"]
		index => "WAS-logs-%{+YYYY.MM.dd}"
	}
	file {
		path => "/appl/logstash/LogStash_WAS_log_output.log"
	}
}
I use custom patterns too:

TimeDate (?<TimeDate>%{DATE} %{TIME} %{TIMEZONE})
DATE (?<DATE>\d{1,2}/\d{1,2}/\d{2})
TIME (?<TIME>\d{1,2}:\d{1,2}:\d{1,2}:\d{1,3})
TIMEZONE (?<TIMEZONE>\S{3})

This is the log:
[6/22/18 1:01:23:615 EDT] 00000078 SystemErr R at org.apache.openjpa.jdbc.kernel.JDBCStoreManager.initializeState(JDBCStoreManager.java:322)
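As a sanity check, the shapes of the custom patterns and the overall grok expression can be verified against this sample line with plain Python regexes (a sketch only; this is not the grok engine itself, and the named groups are hypothetical stand-ins for the grok field names):

```python
import re

# Plain-regex equivalents of the custom grok patterns above.
DATE = r"\d{1,2}/\d{1,2}/\d{2}"
TIME = r"\d{1,2}:\d{1,2}:\d{1,2}:\d{1,3}"
TIMEZONE = r"\S{3}"
timestamp = rf"(?P<WAS_TimeStamp>{DATE} {TIME} {TIMEZONE})"

# Rough equivalent of the grok match expression from the config.
line_re = re.compile(
    rf"\[{timestamp}\] (?P<WAS_ThreadId>[0-9A-Fa-f]+) "
    rf"(?P<WAS_EventType>\w+)\s+(?P<WAS_LogLevel>\w+)\s+"
    rf"(?P<WAS_LogMessage>.*)"
)

line = ("[6/22/18 1:01:23:615 EDT] 00000078 SystemErr R "
        "at org.apache.openjpa.jdbc.kernel.JDBCStoreManager."
        "initializeState(JDBCStoreManager.java:322)")
m = line_re.match(line)
print(m.group("WAS_TimeStamp"))  # 6/22/18 1:01:23:615 EDT
print(m.group("WAS_ThreadId"))   # 00000078
```

This confirms the capture shapes match the sample line, so the failure is in the date filter's pattern tokens, not in the grok stage.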

Below is the processed event from Logstash:

"TimeDate": "6/22/18 1:01:23:615 EDT",
"WAS_TimeStamp": "6/22/18 1:01:23:615 EDT",
"@timestamp": "2018-06-27T19:17:35.304Z",
"TIME": "1:01:23:615",
"WAS_EventType": "SystemErr",
"WAS_LogLevel": "R",
"host": "Host1",
"type": "WAS_logs",
"DATE": "6/22/18",
"tags": [ ... ],
"WAS_LogMessage": "at org.apache.openjpa.jdbc.kernel.JDBCStoreManager.initializeState(JDBCStoreManager.java:322)\r",
"message": "[6/22/18 1:01:23:615 EDT] 00000078 SystemErr R \tat org.apache.openjpa.jdbc.kernel.JDBCStoreManager.initializeState(JDBCStoreManager.java:322)\r",
"@version": "1",
"path": "/appl/logfiles/SystemOut_Test.log",
"WAS_ThreadId": "00000078",

To me it looks like the configuration is correct, but I'm not sure why @timestamp is not getting updated with WAS_TimeStamp.
Could you check what's wrong with this config? Thanks in advance.


(Magnus Bäck) #2

If the date filter fails to parse a string it'll give you details about the failure in the Logstash log.

The problem might be the "EDT" timezone name. I don't think Logstash can parse those. Unless the timezone is always EST/EDT, you may have to extract it into a field of its own and use a translate filter to translate it into a UTC offset that the date filter accepts. (The reason it can't parse timezone names is that they're ambiguous. Take CST for example: what's the UTC offset for that?)
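The abbreviation-to-offset idea can be sketched outside Logstash in a few lines of Python (the dictionary here is a hand-picked, hypothetical stand-in for a translate filter dictionary; since abbreviations are ambiguous, each entry has to be pinned to one meaning by hand):

```python
from datetime import datetime

# Hand-maintained abbreviation -> UTC offset table, the moral
# equivalent of a translate filter dictionary. Ambiguous names
# (e.g. CST) must be resolved to a single offset by the maintainer.
TZ_OFFSETS = {"EST": "-0500", "EDT": "-0400"}

raw = "6/22/18 1:01:23:615 EDT"
stamp, abbrev = raw.rsplit(" ", 1)
# Rewrite the abbreviation as an offset the parser accepts.
normalized = f"{stamp} {TZ_OFFSETS[abbrev]}"
parsed = datetime.strptime(normalized, "%m/%d/%y %H:%M:%S:%f %z")
print(parsed.isoformat())  # 2018-06-22T01:01:23.615000-04:00
```

Once the name is replaced with a numeric offset, any timestamp parser (including the date filter with a `Z`-style token) can handle it unambiguously.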

(Fredrick Yessaian) #3


For another set of logs, I have been using the configuration below to parse dates.
That also has a time zone, and it parses and works great.
Below are the configuration and the JSON log string to be parsed:

filter {
	date {
		locale => "en"
		match => ["loggingTime","EEE d MMM yyyy HH-mm-ss SSS z"]
		target => "@timestamp"
	}
}

   "loggingTime":"Mon 9 Jul 2018 05-40-15 135 EDT",
   "threadName":"SchedulerWorkManager.Alarm Pool : 0",
   "logMessage":"method - process(taskStatus)"

But my first post is for WAS logs, and that one still fails.

(Fredrick Yessaian) #4

I solved the date issue using the config below.

	date {
		match => ["WAS_TimeStamp", "M/dd/yy HH:mm:ss:SSS z", "MM/d/yy HH:mm:ss:SSS z", "M/d/yy HH:mm:ss:SSS z", "MM/dd/yy H:mm:ss:SSS z", "M/d/yy H:mm:ss:SSS z", "MM/d/yy H:mm:ss:SSS z", "M/dd/yy H:mm:ss:SSS z"]
		target => "@timestamp"
	}

I changed the year token from uppercase Y to lowercase y, and the time zone token from ZZZ to z.
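The Y/y distinction matters because in Joda-Time (which the date filter uses) uppercase `Y` is the week-based year while lowercase `y` is the calendar year, and the two disagree around the new year. Python's strftime can illustrate the same mismatch with its ISO week-year directive `%G` (a sketch in Python to show the concept, not Logstash itself):

```python
from datetime import date

# Dec 31, 2018 is a Monday and falls in ISO week 1 of 2019,
# so the calendar year and the week-based year disagree.
d = date(2018, 12, 31)
print(d.strftime("%Y"))  # 2018  (calendar year, like Joda 'y')
print(d.strftime("%G"))  # 2019  (week-based year, like Joda 'Y')
```

Using `YY` in a date filter pattern can therefore parse the wrong year for dates near the year boundary, even when the rest of the pattern matches.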

(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.