Data parse error ["_dateparsefailure"]

filter {
  date {
    match  => ["date", "yyyy-MM-dd HH:mm:ss:SSS Z"]
    target => "logdata"
  }

  mutate {
    remove_field => ["date"]
  }
}
Here, `date` inside the match is the field in the database. Its value looks like:
"date":"2018-07-09T06:36:44.254Z"

Your pattern clearly doesn't match the actual timestamp. Try ISO8601 as the date pattern; if that doesn't work, use yyyy-MM-dd'T'HH:mm:ss.SSS'Z'. Remember to also set timezone => "UTC".
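Applied to the configuration above, that suggestion would look something like the following sketch (field and target names are carried over from the original config):

```
filter {
  date {
    # ISO8601 covers timestamps like "2018-07-09T06:36:44.254Z"
    match    => ["date", "ISO8601"]
    target   => "logdata"
    timezone => "UTC"
  }
}
```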

Still it is not resolved:

"@timestamp" => 2018-07-16T07:06:29.777Z,
       "uid" => 14,
      "tags" => [
    [0] "_dateparsefailure"
],
"first_name" => "e",
 "last_name" => "e",
  "@version" => "1",
     "email" => "en@example.com"

If the date filter fails to parse a timestamp it'll log a message about it in the Logstash log. Find that message and post it along with your current configuration.

This is what Logstash is showing:

"tags" => [
    [0] "_dateparsefailure"
],

No, I'm talking about Logstash's own log file, typically named logstash-plain.log.

I have used a simple stdout {} output; logstash-plain.log shows the following:

[2018-07-17T17:13:01,258][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-17T17:13:01,941][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2018-07-17T17:13:09,072][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-07-17T17:13:09,963][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x54e0b9e6 sleep>"}
[2018-07-17T17:13:10,101][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-07-17T17:13:10,502][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-17T17:14:03,192][INFO ][logstash.inputs.jdbc ] (0.251407s) SELECT CAST(current_setting('server_version_num') AS integer) AS v
[2018-07-17T17:14:03,562][INFO ][logstash.inputs.jdbc ] (0.130932s) SELECT count(*) AS "count" FROM (SELECT * from contacts WHERE date between '2017-07-17 10:24:00.394000+0530' and CURRENT_TIMESTAMP ) AS "t1" LIMIT 1

I'm guessing you're hitting this issue: https://github.com/logstash-plugins/logstash-filter-date/issues/95

Converting the date to a string and then parsing it with the date filter works for me:

filter {
  mutate {
    convert => {"log_date" => "string"}
  }
  date {
    match => ["log_date", "ISO8601"]
  }
}
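Combining that workaround with the earlier advice, the full filter for this thread's setup might look like the sketch below (the `target`, `timezone`, and field names are assumptions carried over from the posts above, not a confirmed final config):

```
filter {
  mutate {
    # The jdbc input delivers the column as a timestamp object;
    # the date filter only parses strings, so convert it first.
    convert => { "date" => "string" }
  }
  date {
    match    => ["date", "ISO8601"]
    target   => "logdata"
    timezone => "UTC"
  }
}
```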
