saif3r
April 4, 2019, 10:19am
1
Hello,
I'm getting my results from a JDBC query, and the result has two columns:
"process_date_local" => 2019-01-08T12:35:52.000Z,
"@timestamp" => 2019-04-04T10:16:33.621Z,
I'm trying to use process_date_local as @timestamp via the date filter, but I'm getting a parse failure error. This is what I used:
filter {
  date {
    match => [ "process_date_local", "YYYY-MM-dd'T'HH:mm:ss.SSSZ" ]
  }
}
Could you tell me what might be the issue, or where I can find the error details?
Thanks!
saif3r
April 4, 2019, 10:54am
2
Seems like I got it solved. I was missing a grok pattern in my config. Here's the new filter from my config:
filter {
  grok {
    match => { "process_date_local" => "%{TIMESTAMP_ISO8601:[@metadata][timestamp]}" }
  }
  date {
    match => [ "[@metadata][timestamp]", "yyyy-MM-dd'T'HH:mm:ss.SSSZ" ]
  }
}
Here's the result:
"process_date_local" => 2019-01-08T12:35:53.000Z,
"@timestamp" => 2019-01-08T12:35:53.000Z,
Badger
April 4, 2019, 11:27am
3
This is a known issue. The date filter cannot parse a Logstash::Timestamp. The normal workaround is to convert it to a string before parsing it:
mutate { convert => { "process_date_local" => "string" } }
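Putting that workaround together, a minimal sketch of the full filter might look like this (field names taken from this thread; the ISO8601 shorthand pattern is an assumption on my part and not tested against this data):

```
filter {
  # The date filter cannot parse a Logstash::Timestamp directly,
  # so cast the field to a string first
  mutate { convert => { "process_date_local" => "string" } }

  date {
    # ISO8601 is the date filter's built-in shorthand for timestamps
    # like 2019-01-08T12:35:52.000Z
    match => [ "process_date_local", "ISO8601" ]
    # target defaults to @timestamp, so no target setting is needed
  }
}
```

With this, the grok step from the earlier reply should be unnecessary, since the string value can be matched directly.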
system
Closed
May 2, 2019, 11:27am
4
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.