Date Filter - _dateparsefailure

Dear all,

I'm running into a strange problem converting a syslog timestamp field into @timestamp.

After reading the thread, I can't make it work. When I run this simple command, it fails:

echo "Nov  2 10:40:49" | logstash -e 'input { stdin {} } filter { date { match => [ "message", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"] } }'

{
      "@version" => "1",
       "message" => "Nov  2 10:40:49",
    "@timestamp" => 2022-06-14T21:05:58.605Z,
          "tags" => [
        [0] "_dateparsefailure"
    ],
          "host" => ""
}

I've also tried using MMM<space><space>d HH:mm:ss, but Logstash failed with this error:

[ERROR] 2022-06-14 17:02:54.415 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: Illegal pattern component: p", :backtrace=>[...]}
warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):
LogStash::Error: Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`
          create at org/logstash/execution/
             add at org/logstash/execution/
  converge_state at /usr/share/logstash/logstash-core/lib/logstash/agent.rb:401
[ERROR] 2022-06-14 17:02:54.421 [Agent thread] agent - An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`"}
[FATAL] 2022-06-14 17:02:54.438 [LogStash::Runner] runner - An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::IllegalStateException` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ `create'", "org/logstash/execution/ `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:401:in `block in converge_state'"]}
[FATAL] 2022-06-14 17:02:54.445 [LogStash::Runner] Logstash - Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/ ~[jruby-complete-]
        at org.jruby.RubyKernel.exit(org/jruby/ ~[jruby-complete-]
        at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:94) ~[?:?]

So I'm a bit confused about how to get this syslog timestamp into the @timestamp field.

Any idea?

I'm running Logstash v7.17.4.


It works for me. The only thing you are doing differently is starting it from the command line, so I would look at that first.

input { stdin {} }
filter {
  date { match => [ "message", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"] }
}
output { stdout { codec => "json_lines" } }


input { generator { count => 1 lines => [ '{ "message": "Nov 2 10:40:49" }' ] codec => json } }
filter {
  date { match => [ "message", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"] }
}
output { stdout { codec => "json_lines" } }

It works for me too.

There is only a single space between the month and the day, but that is fine since the dd in "MMM dd HH:mm:ss" will match one or two digits.

You cannot use <space> to represent a space. Alphabetic characters in the pattern have to be valid Joda-Time specifiers. The p in "space" is causing this exception; if you removed that, it would complain about the c. If you removed that too, your pattern would match Nov<59am3>2 10:40:49, which has seconds, am/pm, and a numeric day of the week between the <>.
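To see that the two-literal-spaces pattern itself is valid, here is a small sketch using java.time as a stand-in for the Joda-Time library the date filter uses internally (the class name is mine; for this pattern the letters behave the same way in both libraries):

```java
import java.time.format.DateTimeFormatter;
import java.time.temporal.ChronoField;
import java.time.temporal.TemporalAccessor;
import java.util.Locale;

public class SyslogPatternDemo {
    public static void main(String[] args) {
        // "MMM  d HH:mm:ss" -- the two literal spaces match the double
        // space syslog emits before single-digit days.
        DateTimeFormatter f =
                DateTimeFormatter.ofPattern("MMM  d HH:mm:ss", Locale.ENGLISH);

        // Parses with an English locale, where "Nov" has no period.
        TemporalAccessor t = f.parse("Nov  2 10:40:49");
        System.out.println(t.get(ChronoField.DAY_OF_MONTH)); // prints 2
    }
}
```

Anything that is literal text rather than a pattern letter would have to be quoted with single quotes, which is why unquoted letters like the p in <space> blow up at pipeline creation time.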

Hi there,

I tried the following:

input { generator { codec => json count => 1 lines => [ '{"message":"Nov  2 10:40:49"}' ] } }

filter {
  date { match => [ "message", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"] }
}

output {
  stdout {
    codec => rubydebug { metadata => true }
  }
}

To launch it, I'm using logstash -f pipeline-test6.conf. I'm still running into the same failure:

{
      "sequence" => 0,
    "@timestamp" => 2022-06-15T12:39:35.631Z,
          "host" => "",
      "@version" => "1",
       "message" => "Nov  2 10:40:49",
          "tags" => [
        [0] "_dateparsefailure"
    ]
}

Do you think it may be related to the LANG of my server? It has LANG=en_CA.UTF-8 as value.

I don't understand why it works on your system but not on mine... :pensive:


Hi Badger!

My bad! I thought '<space>' was something like what we have in the grok filter, such as %{SPACE}.

But as I answered to aaron, even using his config it continues to fail with _dateparsefailure.


Yes, that is the problem. Someone else had this problem last week and solved it by using mutate+gsub to change the abbreviated month name to a month number. The problem is that that particular locale requires a period at the end of the month abbreviation. So May 2 10:40:49 works, and Nov. 2 10:40:49 works, but Nov 2 10:40:49 cannot be parsed.
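The locale data can be inspected directly from the JVM. A minimal sketch (class name is mine; on JDK 9+ the default locale data comes from CLDR, which is where the en_CA abbreviations live):

```java
import java.time.Month;
import java.time.format.TextStyle;
import java.util.Locale;

public class MonthAbbrevDemo {
    public static void main(String[] args) {
        Locale enCA = Locale.forLanguageTag("en-CA");
        // CLDR's en_CA data abbreviates most months with a trailing
        // period ("Nov.") but leaves "May" bare -- which is why
        // "May  2 ..." parsed while "Nov  2 ..." did not.
        System.out.println(Month.MAY.getDisplayName(TextStyle.SHORT, enCA));
        System.out.println(Month.NOVEMBER.getDisplayName(TextStyle.SHORT, enCA));
    }
}
```

The exact strings printed depend on the JDK version and its bundled CLDR data, but on a recent JDK the second line typically ends with a period.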

Oh wow! You're exactly right!

A date like May 14 15:05:01 parses correctly, while Feb 14 15:05:01 fails with the _dateparsefailure tag :roll_eyes:

When I tried with a date such as Feb. 14 15:05:01 this time, my grok parser failed with _grokparsefailure_sysloginput. The grok pattern %{SYSLOGTIMESTAMP} doesn't seem to accept the '.' after the month name.
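(That failure tag comes from the syslog input's built-in grok, which is hard to change; if the line is parsed with a standalone grok filter instead, a sketch that tolerates the optional period could look like this — the syslog_timestamp field name is mine, and this is untested:)

```
grok {
  match => { "message" => "(?<syslog_timestamp>%{MONTH}\.? +%{MONTHDAY} %{TIME})" }
}
```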

I will try to implement the solution from the other thread. I also opened a case with Elastic to let them know about this issue. Their documentation for the date filter reads as follows:

The date filter is used for parsing dates from fields, and then using that date or timestamp as the logstash timestamp for the event.

For example, syslog events usually have timestamps like this:

`"Apr 17 09:32:01"`

You would use the date format MMM dd HH:mm:ss to parse this.

The date filter is especially important for sorting events and for backfilling old data. If you don’t get the date correct in your event, then searching for them later will likely sort out of order.

So I'm expecting this to work!

Thanks again Badger, your help is very appreciated!

Hi there!

Here is how I solved it, and it is working fine. I had to play around a bit because syslog doesn't include the year in the date field. So it looks like this:

  mutate {
    gsub => [
      "syslog.timestamp", "Jan ", "01-",
      "syslog.timestamp", "Feb ", "02-",
      "syslog.timestamp", "Mar ", "03-",
      "syslog.timestamp", "Apr ", "04-",
      "syslog.timestamp", "May ", "05-",
      "syslog.timestamp", "Jun ", "06-",
      "syslog.timestamp", "Jul ", "07-",
      "syslog.timestamp", "Aug ", "08-",
      "syslog.timestamp", "Sep ", "09-",
      "syslog.timestamp", "Oct ", "10-",
      "syslog.timestamp", "Nov ", "11-",
      "syslog.timestamp", "Dec ", "12-"
    ]
  }

  date {
    match => [ "syslog.timestamp", "MM-dd HH:mm:ss", "MM- d HH:mm:ss" ]
    timezone => "America/Toronto"
    target => "@timestamp"
    remove_field => [ "syslog.timestamp" ]
  }
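(A possible alternative, not tried in this thread: the date filter also has a locale option. Forcing plain English should make Joda use the period-free month abbreviations directly, avoiding the gsub step entirely. A sketch, reusing the field name from the config above:)

```
  date {
    match => [ "syslog.timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    locale => "en"
    timezone => "America/Toronto"
    target => "@timestamp"
  }
```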

Thank you so much @Badger, @yhache and @aaron-nimocks!

It is not really something Elastic can address (except in terms of documentation). The problem is way down in the Unicode CLDR data that the JVM includes.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.