_dateparsefailure when trying to overwrite @timestamp

Hello,

I have been trying to get the timestamp to match the log entry rather than the time it was ingested. I am using Logstash 8.12.2. My log lines look like this:

{"Timestamp":"2024-06-03 20:18:59.2251", "Message":"BLAH"}

My logstash looks like this:

filter {
	if "JSON" in [tags] {
		grok {
			match => {
				"message" => [
					"^(?m)\{\"Timestamp\"\:\"%{TIMESTAMP_ISO8601:Timestamp}\"\,%{GREEDYDATA:json_data}$"
				]
			}
		}
	}
	mutate {
		gsub => ["json_data", "^", "{"]
	}
	json {
		source => "json_data"
	}
	mutate {
		remove_field => ["message"]
		remove_field => ["json_data"]
	}
	# only the things below are not working
	date {
		match => ["Timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss:SSSS", "yyyy-MM-dd HH:mm:ss:SSS"]
		target => "@timestamp"
	}
}

Everything is working except the date parsing. No matter what I try I keep getting a _dateparsefailure. Does anyone have any ideas why I cannot get this to work and how to fix it?

Should be:

	date {
		match => ["Timestamp","yyyy-MM-dd HH:mm:ss.SSSS"]
		}

I tried that... and just tried it again and it did not work.

Please share the output you are getting in Logstash.

Also, the log line you shared is JSON, so why are you using a grok filter to parse it instead of a json filter?

If you are getting a _dateparsefailure it means that the value in your Timestamp field is not matching any of the patterns you specified. You need to share a sample document that is giving you this error and the Logstash output you are getting.
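Since the whole line is JSON, something like this might replace the grok + gsub + json chain entirely (a sketch, untested, assuming the raw line arrives in [message] and using the field names from your sample):

	filter {
	  if "JSON" in [tags] {
	    # parse the entire line as JSON; creates [Timestamp], [Message], etc.
	    json {
	      source => "message"
	    }
	    date {
	      # note the "." before the fractional seconds, as in the sample line
	      match => ["Timestamp", "yyyy-MM-dd HH:mm:ss.SSSS"]
	    }
	    mutate {
	      remove_field => ["message"]
	    }
	  }
	}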


Can you explain what didn't work?

input {
  generator {
    message => "BLAH"
    count => 1
  }
}
filter {

  mutate { add_field => { "Timestamp" => "2024-06-03 20:11:59.2251" } }

  date { match => ["Timestamp","yyyy-MM-dd HH:mm:ss.SSSS"] }

}

output {
  stdout {}
}

Output:

{
      "@version" => "1",
    "@timestamp" => 2024-06-03T18:11:59.225Z,
         "event" => {
        "original" => "BLAH",
        "sequence" => 0
    },
       "message" => "BLAH",
     "Timestamp" => "2024-06-03 20:11:59.2251"
}

You set target => "@timestamp". I didn't, because there is no need: Logstash uses @timestamp as the default target, and it will always be in the UTC time zone.

You put : before the milliseconds instead of . and that is why you had a _dateparsefailure.
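Side by side:

	# fails: colon before the fractional seconds
	match => ["Timestamp", "yyyy-MM-dd HH:mm:ss:SSSS"]

	# works: dot, matching "2024-06-03 20:18:59.2251"
	match => ["Timestamp", "yyyy-MM-dd HH:mm:ss.SSSS"]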

Let me clarify: the log line looks more like this:

{"Timestamp":"2024-06-03 20:18:59.2251", "Message":"LoadNextWindow","Caller":{"Class":"Base","Method":"OpenWindow"},"TypeOfApplication":{"ID":"ID","ApplicationName":"Application"}}

Is it possible to still just use the json filter? If so, I will give that a try. I am not sure how to show the output I am getting without revealing information I am not able to share.

@Rios By "didn't work" I mean that when I tried your suggestion I was still getting a _dateparsefailure in Kibana.

Something is not clear here. It looks like you are using the JSON codec on events, and then you also parse with grok. Read what Leandro said; something is not OK with the grok parsing.

Can you copy a few raw log/event lines as they arrive at Logstash? Either the event.original field or the message field, before it is overwritten.
Note that "message" and "Message" are not the same fields.
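To capture that output without sending it anywhere, a stdout output with the rubydebug codec prints the full event structure, which is easy to redact before sharing (a sketch; swap it in for your existing output while debugging):

	output {
	  # print every field of the event to the console; redact values before posting
	  stdout { codec => rubydebug }
	}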

I got it working! My mistake was in the timestamp pattern: I was using a ":" where there should have been a ".".

Thank you for all your help!
