Date format not working in logstash 6.3

I am using Logstash 6.3, and below are my config and input message. For some reason it is not converting the string to a date.

input { stdin { codec => json } }
output { stdout { codec => rubydebug } }
filter {
  date {
    match  => ["Report_Host_Start", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy"]
    target => "Report_Host_Start"
    locale => "en-US"
  }
  # ruby {
  #   code => "event.set('@timestamp', event.get('Report_Host_Start'));"
  # }
}

Here is the input to the above config:

{"Report_Host_Start": "Tue Jun 5 14:27:27 2018", "syslog_hostname": "foo"}

Here is the output I am getting, with _dateparsefailure:

{
             "@version" => "1",
                 "host" => "SVYAHALKAR440",
    "Report_Host_Start" => "Tue Jun  5 14:27:27 2018",
      "syslog_hostname" => "foo",
           "@timestamp" => 2018-09-03T19:42:45.545Z,
                 "tags" => [
        [0] "_dateparsefailure"
    ]
}

I am testing this on Windows, but I get the same issue on Linux as well.

Can someone please help?

In this particular example there are two spaces between "Jun" and "5", but your date pattern has just a single space.

But check your logs for clues. The date filter logs a message indicating which part of the string it has problems with.
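
If you'd rather not enumerate every spacing variant, you could also normalize the whitespace before parsing with the mutate filter's gsub option and keep a single one-space pattern. Something along these lines (just a sketch, not tested against your data):

filter {
  # Collapse runs of spaces so "Tue Jun  5 ..." becomes "Tue Jun 5 ..."
  mutate {
    gsub => ["Report_Host_Start", " +", " "]
  }
  # With a single "d", Joda parses both one- and two-digit days
  date {
    match  => ["Report_Host_Start", "EEE MMM d HH:mm:ss yyyy"]
    target => "Report_Host_Start"
    locale => "en-US"
  }
}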

I am new to the Elastic Stack. I tried running Logstash in debug mode and info mode, but I am not able to get any more information about where exactly the issue is coming from. I would really appreciate it if you could point me in the right direction.

[2018-09-03T14:30:52,848][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
{"Report_Host_Start": "Tue Jun  5 14:27:27 2018", "syslog_hostname": "foo"}
[2018-09-03T14:31:14,018][DEBUG][logstash.pipeline        ] filter received {"event"=>{"syslog_hostname"=>"foo", "Report_Host_Start"=>"Tue Jun  5 14:27:27 2018", "@timestamp"=>2018-09-03T21:31:13.855Z, "@version"=>"1", "host"=>"SVYAHALKAR440"}}
[2018-09-03T14:31:14,074][DEBUG][logstash.pipeline        ] output received {"event"=>{"syslog_hostname"=>"foo", "Report_Host_Start"=>"Tue Jun  5 14:27:27 2018", "@timestamp"=>2018-09-03T21:31:13.855Z, "@version"=>"1", "host"=>"SVYAHALKAR440", "tags"=>["_dateparsefailure"]}}
[2018-09-03T14:31:14,102][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-09-03T14:31:14,105][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
{
      "syslog_hostname" => "foo",
    "Report_Host_Start" => "Tue Jun  5 14:27:27 2018",
           "@timestamp" => 2018-09-03T21:31:13.855Z,
             "@version" => "1",
                 "host" => "SVYAHALKAR440",
                 "tags" => [
        [0] "_dateparsefailure"
    ]
}

Magnus,

Dude, you are awesome. I was thinking that since I have two patterns in my filter it would pick up the single-digit date, but I actually had to add a pattern with the extra space. It's working now. Thanks for your help.

input { stdin { codec => json } }
output { stdout { codec => rubydebug } }
filter {
  date {
    match  => ["Report_Host_Start", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM  dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy"]
    target => "Report_Host_Start"
    locale => "en-US"
  }
  # ruby {
  #   code => "event.set('@timestamp', event.get('Report_Host_Start'));"
  # }
}
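
On a related note, I realized I may not need the commented-out ruby block at all. If I want @timestamp to follow Report_Host_Start, my understanding is that a second date filter without a target will do it, since @timestamp is the date filter's default target. Something like this (not tested yet):

filter {
  # Set @timestamp from the original string (the date filter's default target)
  date {
    match  => ["Report_Host_Start", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM  dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy"]
    locale => "en-US"
  }
  # Then overwrite the field itself with the parsed date
  date {
    match  => ["Report_Host_Start", "EEE MMM dd HH:mm:ss yyyy", "EEE MMM  dd HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy"]
    target => "Report_Host_Start"
    locale => "en-US"
  }
}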

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.