Getting _dateparsefailure

Hi,

I have a date format like 20170616155531767-0500 which I am unable to parse. Can you give me a solution for how I can parse this date? I used the date filter:

date {
  locale => "en"
  match => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSSZ" ]
}

Thanks!

I also tried yyyyMMddHHmmssSSS-Z but I'm getting the same _dateparsefailure.

I also tried:

date {
  #locale => "en"
  match => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSS-Z", "ISO8601" ]
}

date {
  #locale => "en"
  match => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSS-Z" ]
}

The date filter will log details about its failure, so I suggest you look in Logstash's log file for clues.

Hi,

Thanks for reply.

It's giving REQUEST_TIMESTAMP as a string, not as a date. I'm not getting any error either.

Works fine here:

$ cat date.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  date {
    match => [
      "message",
      "yyyyMMddHHmmssSSSZ"
    ]
  }
}
$ echo '20170616155531767-0500' | /opt/logstash/bin/logstash -f date.config
Settings: Default pipeline workers: 8
Pipeline main started
{
       "message" => "20170616155531767-0500",
      "@version" => "1",
    "@timestamp" => "2017-06-16T20:55:31.767Z",
          "host" => "lnxolofon"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

It's giving REQUEST_TIMESTAMP as a string, not as a date.

What do you mean?

I'm not getting any error either.

I find it very hard to believe that a date filter that adds _dateparsefailure doesn't log anything about what caused the error.

Hi Magnusbaeck,

I wrote the date filter as you suggested:

date {
  #locale => "en"
  match => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSSZ" ]
}

But REQUEST_TIMESTAMP is coming through as a string, not as a date. I am able to ingest the logs into ES, but REQUEST_TIMESTAMP is indexed as a string and I want it as a date.

This is my config file:

input {
  file {
    path => [ "/abc/xyz.csv" ]
    start_position => "beginning"
    type => "elastic"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}

filter {
  csv {
    separator => ","
    columns => ["abc","assd","REQUEST_TIMESTAMP","RESPONSE_TIMESTAMP"]
    skip_empty_columns => true
    quote_char => "$"
  }

  date {
    match => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSSZ" ]
  }

  date {
    match => [ "RESPONSE_TIMESTAMP", "yyyyMMddHHmmssSSSZ" ]
  }
}

output {
  elasticsearch {
    hosts => ["ip"]
    index => "xxx-%{+YYYY.MM}"
  }

  stdout { codec => rubydebug }
}

I tried the same:

Please help...
Thanks!

As your example shows, the date filter is working just fine. Please don't post screenshots; use copy/paste instead.

Unless configured otherwise the date filter writes the parsed timestamp to the @timestamp field. If you want it stored elsewhere you have to adjust the target option.
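For example, a minimal sketch that stores the parsed value back into REQUEST_TIMESTAMP instead of @timestamp (the field name is just the one from this thread; any field name works):

```
date {
  match  => [ "REQUEST_TIMESTAMP", "yyyyMMddHHmmssSSSZ" ]
  target => "REQUEST_TIMESTAMP"
}
```

With target set, @timestamp is left untouched and the parsed timestamp ends up in the named field.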

Secondly, because the mapping of an existing field can't be modified, you'll have to wait until the next index is created (typically the next day) or you can delete the current index so that it's recreated.
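If you go the deletion route, a sketch of the call, assuming Elasticsearch is reachable on localhost:9200 and using a hypothetical monthly index name matching the xxx-%{+YYYY.MM} pattern above (substitute your real host and index name):

```
curl -XDELETE 'http://localhost:9200/xxx-2017.06'
```

After deletion, the next event Logstash ships will recreate the index, and the parsed field will then be mapped as a date.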

Thanks a lot, it works for me!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.