Data parse error ["_dateparsefailure"]

(Gokul Kathirvel) #1

I am getting a date parse error ("_dateparsefailure"). My filter looks like this:

date {
  match => ["date", "yyyy-MM-dd HH:mm:ss:SSS Z"]
}
mutate {
  remove_field => [ "date" ]
}

where "date" inside the match is the field in the database.

(Magnus Bäck) #2

Your pattern clearly doesn't match the actual timestamp. Try using ISO8601 as the date pattern. If that doesn't work, use yyyy-MM-dd'T'HH:mm:ss.SSS'Z'. Remember to also set timezone => "UTC".
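Written out as a minimal filter sketch of Magnus's suggestion (the field name "date" and the UTC timezone come from the thread; the exact fallback pattern is an assumption about the source data):

```
filter {
  date {
    # Try the built-in ISO8601 matcher first, then the explicit
    # Joda-style pattern as a fallback. The date filter accepts
    # several formats after the field name and tries them in order.
    match    => ["date", "ISO8601", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"]
    timezone => "UTC"
  }
}
```

If parsing succeeds, the result is written into @timestamp by default; a separate target option can redirect it to another field.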

(Gokul Kathirvel) #3

It is still not resolved:

{
    "@timestamp" => 2018-07-16T07:06:29.777Z,
           "uid" => 14,
          "tags" => [
        [0] "_dateparsefailure"
    ],
    "first_name" => "e",
     "last_name" => "e",
      "@version" => "1",
         "email" => ""
}

(Magnus Bäck) #4

If the date filter fails to parse a timestamp it'll log a message about it in the Logstash log. Find that message and post it along with your current configuration.

(Gokul Kathirvel) #5

This is what is showing in the Logstash output:

"tags" => [
    [0] "_dateparsefailure"
]

(Magnus Bäck) #6

No, I'm talking about Logstash's own log file, typically named logstash-plain.log.

(Gokul Kathirvel) #7

I have used a simple stdout {} output; logstash-plain shows the following:

[2018-07-17T17:13:01,258][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-17T17:13:01,941][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.0"}
[2018-07-17T17:13:09,072][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-07-17T17:13:09,963][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x54e0b9e6 sleep>"}
[2018-07-17T17:13:10,101][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-07-17T17:13:10,502][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-17T17:14:03,192][INFO ][logstash.inputs.jdbc ] (0.251407s) SELECT CAST(current_setting('server_version_num') AS integer) AS v
[2018-07-17T17:14:03,562][INFO ][logstash.inputs.jdbc ] (0.130932s) SELECT count(*) AS "count" FROM (SELECT * from contacts WHERE date between '2017-07-17 10:24:00.394000+0530' and CURRENT_TIMESTAMP ) AS "t1" LIMIT 1

(Magnus Bäck) #8

I'm guessing you're hitting this issue:

(Gokul Kathirvel) #9

Converting the date to a string and then parsing it with the date filter works for me:

filter {
  mutate {
    convert => { "log_date" => "string" }
  }
  date {
    match => ["log_date", "ISO8601"]
  }
}
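As a hedged extension of this workaround (not from the thread itself): the same filter can also pin the timezone and drop the converted string field once it has been parsed into @timestamp. The UTC value is an assumption; adjust it to match the source data.

```
filter {
  mutate {
    convert => { "log_date" => "string" }
  }
  date {
    match    => ["log_date", "ISO8601"]
    timezone => "UTC"   # assumption: set this to the database's timezone
  }
  mutate {
    # once parsed into @timestamp, the string copy is no longer needed
    remove_field => ["log_date"]
  }
}
```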

(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.