I have been trying to get the timestamp to match the log entry and not the time it was ingested. I am using Logstash 8.12.2. My log lines are JSON objects that start with a Timestamp field, and my filter looks like this:
filter {
  if "JSON" in [tags] {
    grok {
      match => {
        "message" => [
          "^(?m)\{\"Timestamp\"\:\"%{TIMESTAMP_ISO8601:Timestamp}\"\,%{GREEDYDATA:json_data}$"
        ]
      }
    }
  }
  mutate {
    gsub => ["json_data", "^", "{"]
  }
  json {
    source => "json_data"
  }
  mutate {
    remove_field => ["message"]
    remove_field => ["json_data"]
  }
  # only the things below are not working
  date {
    match => ["Timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss:SSSS", "yyyy-MM-dd HH:mm:ss:SSS"]
    target => "@timestamp"
  }
}
Everything is working except the date parsing. No matter what I try, I keep getting a _dateparsefailure. Does anyone have any idea why I cannot get this to work and how to fix it?
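To illustrate what the grok, gsub, and json steps are doing, here is a made-up line in the same shape as my real data (I cannot post the real ones; the timestamp and field values are invented), and what the filter produces from it:

Input line:
  {"Timestamp":"2024-03-10 14:22:05:1234","Level":"INFO","Message":"example"}

After the grok match:
  Timestamp => "2024-03-10 14:22:05:1234"
  json_data => "Level":"INFO","Message":"example"}

After the gsub puts the leading { back and the json filter parses json_data:
  Level => "INFO"
  Message => "example"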
Please share the output you are getting in Logstash.
Also, the log line you shared is JSON; why are you using a grok filter to parse it instead of a json filter? (See the sketch below.)
If you are getting a _dateparsefailure, it means that the value in your Timestamp field is not matching any of the patterns you specified. You need to share a sample document that is giving you this error and the Logstash output you are getting.
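If the whole line is already valid JSON, you do not need grok or gsub at all; a sketch like this (reusing your field name and date patterns) would parse it directly:

filter {
  # parse the whole line; the JSON keys, including Timestamp, become top-level fields
  json {
    source => "message"
  }
  # same patterns you already tried; adjust once you can see the real Timestamp value
  date {
    match => ["Timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss:SSSS", "yyyy-MM-dd HH:mm:ss:SSS"]
    target => "@timestamp"
  }
}

With that in place, the _jsonparsefailure / _dateparsefailure tags will tell you which step is actually failing.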
Is it possible to still just use the json filter? If so, I will give that a try. I am not sure how to share the output I am getting without showing information I am not allowed to show.
Something is not clear here. It looks like you are using the JSON codec on events, but then you parse them with grok. Read what Leandro said; something is not OK with the grok parsing.
Can you copy a few raw log/event lines as they arrive in Logstash? The event.original field, or the message field before it is overwritten (see the debug sketch below).
The "message" and "Message" fields are not the same.