Timestamp replace not working

Hi, I understand this may be a very common question, but I couldn't find any answer that addressed my issue.

Here goes: I have a log message, shown below, that I am shipping from Filebeat into Logstash.
20170128 144622.584437 0005B9427CA0_CU_1 user.info sometext.

Problem: @timestamp is still set to the current time rather than the time in the log message.

I have defined a filter as below:
filter {
  date {
    match => ["timestamp", "yyyyMMdd HHmmss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched" }
  }
}

Logstash log messages:
[2017-07-17T12:58:42,473][DEBUG][logstash.pipeline ] filter received {"event"=>{"@timestamp"=>2017-07-17T07:28:41.042Z, "offset"=>746730, "@version"=>"1", "beat"=>{"hostname"=>"node1", "name"=>"node1", "version"=>"5.5.0"}, "input_type"=>"log", "host"=>"node1", "source"=>"/root/samplelogs/Debug.log", "message"=>"20170128 144622.584437 0005B9427CA0_CU_1 user.info sometext", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"]}}

[2017-07-17T12:58:42,473][DEBUG][logstash.pipeline ] output received {"event"=>{"@timestamp"=>2017-07-17T07:28:41.042Z, "offset"=>746730, "@version"=>"1", "beat"=>{"hostname"=>"node1", "name"=>"node1", "version"=>"5.5.0"}, "input_type"=>"log", "host"=>"node1", "source"=>"/root/samplelogs/Debug.log", "message"=>"20170128 144622.584437 0005B9427CA0_CU_1 user.info sometext", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"]}}

What seems to be the issue here? Is the timestamp format missing something?

-Thanks
Nikhil

Edit: Updated the post after removing an incorrect ":" character. The issue is still not resolved.

Where's the timestamp field that you're asking the date filter to parse? How are you creating that field?

It's at the beginning of the line; it's created by syslogd. Thanks.

Yes, I know it's in the message field, but according to the log entry you posted,

[2017-07-17T12:58:42,473][DEBUG][logstash.pipeline ] output received {"event"=>{"@timestamp"=>2017-07-17T07:28:41.042Z, "offset"=>746730, "@version"=>"1", "beat"=>{"hostname"=>"node1", "name"=>"node1", "version"=>"5.5.0"}, "input_type"=>"log", "host"=>"node1", "source"=>"/root/samplelogs/Debug.log", "message"=>"20170128 144622.584437 0005B9427CA0_CU_1 user.info sometext", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"]}}

the event has no timestamp field which would explain why the date filter isn't working.

Ohh. I thought the filter would match if the timestamp is part of the message. So does that mean I first have to parse the message, extract the timestamp into a separate field, and then apply the replace operation?
In other words, have two filters: a grok to extract and then a date to replace?
Thanks.

Hi Magnus,

I modified my filter as below:

filter {
  grok {
    patterns_dir => ["/root/logstash-5.5.0/patterns"]
    match => { "message" => "^%{CSSYSLOGTIMESTAMP:syslog_timestamp} %{DATA:syslog_hostname} %{DATA:syslog_level} %{DATA:app_name}: %{GREEDYDATA:syslog_message}" }
  }
  date {
    match => ["syslog_timestamp" , "yyyyMMdd HHmmss.SSS"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched"}
  }
}
> [2017-07-18T10:25:01,152][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-07-18T04:54:55.170Z, "offset"=>747452, "@version"=>"1", "input_type"=>"log", "beat"=>{"hostname"=>"node1", "name"=>"node1", "version"=>"5.5.0"}, "host"=>"node1", "source"=>"/root/samplelogs/Debug.log", "message"=>"20170119 144002.184140 0005B9427CA0_CU_1 user.notice ProcMon:: 10.220.0.13 is valid IP Address", "type"=>"log", "tags"=>["beats_input_codec_plain_applied"]}}
> [2017-07-18T10:25:01,154][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>2017-07-18T04:54:55.170Z node1 20170119 144002.184140 0005B9427CA0_CU_1 user.notice ProcMon:: 10.220.0.13 is valid IP Address}
> [2017-07-18T10:25:01,159][DEBUG][logstash.filters.grok    ] Event now:  {:event=>2017-07-18T04:54:55.170Z node1 20170119 144002.184140 0005B9427CA0_CU_1 user.notice ProcMon:: 10.220.0.13 is valid IP Address}
> [2017-07-18T10:25:01,165][DEBUG][logstash.pipeline        ] output received {"event"=>{"MSEC"=>"184140", "offset"=>747452, "input_type"=>"log", "source"=>"/root/samplelogs/Debug.log", "message"=>"20170119 144002.184140 0005B9427CA0_CU_1 user.notice ProcMon:: 10.220.0.13 is valid IP Address", "type"=>"log", "syslog_message"=>"10.220.0.13 is valid IP Address", "tags"=>["beats_input_codec_plain_applied", "_dateparsefailure"], "app_name"=>"ProcMon:", "@timestamp"=>2017-07-18T04:54:55.170Z, "syslog_hostname"=>"0005B9427CA0_CU_1", "syslog_timestamp"=>"20170119 144002.184140", "@version"=>"1", "beat"=>{"hostname"=>"node1", "name"=>"node1", "version"=>"5.5.0"}, "host"=>"node1", "syslog_level"=>"user.notice"}}

My new timestamp field (syslog_timestamp) is now created. However, the @timestamp value is still not replaced. What am I doing wrong?
Thanks.

The problem might be that your timestamp contains microseconds but the "SSS" in your date pattern only captures milliseconds. Try using "SSSSSS" instead, or strip the last three digits with a mutate filter's gsub option. Elasticsearch's resolution is only milliseconds anyway.
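Something along these lines (an untested sketch, using the syslog_timestamp field from your grok filter) would trim the fraction to millisecond precision before the date filter runs:

```
filter {
  mutate {
    # Drop the last three digits of the microsecond fraction,
    # e.g. "20170119 144002.184140" becomes "20170119 144002.184"
    gsub => ["syslog_timestamp", "[0-9]{3}$", ""]
  }
  date {
    match => ["syslog_timestamp", "yyyyMMdd HHmmss.SSS"]
    target => "@timestamp"
  }
}
```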

Yes, I found the problem too and was going to reply the same here. 🙂

"Found the issue. I was only using upto 3 decimals (SSS) because of this text mentioned in the reference doc.
"S: fraction of a second Maximum precision is milliseconds (SSS). Beyond that, zeroes are appended."
But it turns out you still need to use 6 S's while defining the filter. "
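For reference, the date filter that resolved it would then look like this (six S's to cover the microsecond fraction):

```
filter {
  date {
    # Six S's to match the six-digit fraction, e.g. 144002.184140
    match => ["syslog_timestamp", "yyyyMMdd HHmmss.SSSSSS"]
    target => "@timestamp"
  }
}
```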

Thanks for your help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.