Date format parsing

Hi,

The original format of the date in the events is:

Monday, March 5, 2018 11:58:24:874

The date filter I used:

       date {
                  match => [ "timestamp", "EEEE, MMMM d, yyyy HH:mm:ss:SSS" ]
       }

I am getting an error:

Could not index event to Elasticsearch

"reason"=>"failed to parse [timestamp]"

How can I solve this?

I thought of using a ruby filter to reformat the date.

I did:

       ruby {
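              # reformat the date into a separate eventtimestamp field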
              code => "event.set('eventtimestamp', event.get('timestamp').time.strftime('%A, %B %-d, %Y %H:%M:%S:%L'))"
       }

But something is wrong here: I am getting an exception from the time() call.

Any idea?

Thanks
Sharon.


Why keep the timestamp field in the first place? Your date filter will write the parsed result to the @timestamp field.
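
For example, something like this (an untested sketch) parses the string into @timestamp and drops the original field:

       date {
             match => [ "timestamp", "EEEE, MMMM d, yyyy HH:mm:ss:SSS" ]
             remove_field => [ "timestamp" ]
       }

remove_field is one of the common filter options and is only applied when the filter succeeds, so events whose timestamp fails to parse keep the raw value for debugging.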

Is this what you mean?

My Logstash code:

    ##########################################
    if [fields][type] == "crmauditcrmcsrsrv" {
    ##########################################
       mutate {
             add_field => { "[@metadata][fields_type]" => "crmauditcrmcsrsrv" }
       }
       grok {
             break_on_match => true
             keep_empty_captures => false
             match => {
                   message => [
                         "%{AUDITTIME:timestamp}\sThreadID\:%{NUMBER:ThreadDetails}\s%{DATA:service}\sAUDIT:\s%{GREEDYDATA:auditinfo}\s%{HTTPMETHOD:method}\s%{DATA:methodDescription}\s%{DATA:transactionStack}\s\|%{DATA:idNumber}\|"
                   ]
             }
             patterns_dir => "/etc/logstash/patterns"
       }
       # parse the key=value pairs in auditinfo, but only when it is non-empty
       if [auditinfo] =~ /.+/ {
             kv {
                   source => "auditinfo"
                   value_split => "="
                   field_split => " "
             }
       }
       date {
             match => [ "timestamp", "EEEE, MMMM d, yyyy HH:mm:ss:SSS" ]
       }
    }

The grok pattern for the timestamp (AUDITTIME):

    AUDITTIME %{DAY}, %{MONTH} %{MONTHDAY}, %{YEAR} %{HOUR}:%{MINUTE}:%{SECOND}:%{MILSECOND}
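
(DAY, MONTH, MONTHDAY, YEAR, HOUR, MINUTE and SECOND are stock grok patterns; MILSECOND is not, so the patterns file has to define it as well, along the lines of:)

    MILSECOND \d{3}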

In the Logstash stdout I see (which is correct):

     "@timestamp" => 2017-06-29T13:14:14.461Z,
   
      "timestamp" => "Thursday, June 29, 2017 16:14:14:461"

But in the Logstash log I see the following (the events aren't being indexed into Elasticsearch):

    [2018-03-26T14:57:51,265][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash2017.06.29crmauditcrmcsrsrv", :_type=>"log", :_routing=>nil}, 2017-06-29T13:14:14.461Z vpelastic Thursday, June 29, 2017 16:14:14:461 ThreadID:92 com.clariertefy.cih.ertetservicedrfgegs.party.xbeans.ReeretrievePrtgergtayMeansXB.postExecute
    AUDIT: User=Admdedwdwsa1 Transaction=Asefferf1-0wdqd002d-00000000-00001ba5 SET CardBankAccount com.clarify.cbo.Field.pay_means.id_number |TJlI|], :response=>{"index"=>{"_index"=>"logstash2017.06.29crmauditcrmcsrsrv", "_type"=>"log", "_id"=>"AWJiK_cfgfgdix8S55454z1F6", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"Thursday, June 29, 2017 16:14:14...\""}}}}}

Thanks
Sharon.

As I said, do you really need to keep the timestamp field when you have @timestamp? The index has mapped timestamp as a date, and Elasticsearch can't parse "Thursday, June 29, 2017 16:14:14:461" against that mapping, which is exactly what the mapper_parsing_exception is telling you. If you don't need timestamp you can remove it and the error will disappear.
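
A minimal sketch, assuming you don't use timestamp anywhere downstream, is a mutate after the date filter:

       mutate {
             remove_field => [ "timestamp" ]
       }

(Or put remove_field on the date filter itself, as shown above; that way the field is only removed when the parse actually succeeds.)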

Now I understand. Thanks!
