Grok filter TIMESTAMP_ISO8601 appears in Kibana as String

Hi all,

I've done a bit of rooting around Google but can't find any definitive answers, so I decided to humble myself and request help directly.

The problem I have is that, like many incoming messages, my log entry contains a timestamp field. Here's an example message:

spc01ws14 sales 2020-12-10 09:15:11,909 INFO UIRequest runLifecycleSteps (102 ms) location=PolicyFile; Sources='TabBar/PolicyTab/PolicyTab_PolicyRetrievalItem_widget/PolicyTab_PolicyRetrievalItem_Button'

The datetime format seems to comply with ISO 8601, so I use the following pattern to parse my message in a grok block:

```
grok {
  match => {
    "message" => "(?<serverid>([a-z]{3}[0-9]{2}[a-z]{2}[0-9]{2}){0,1})\s+(?<username>([A-Za-z0-9.@\-]*){0,1})\s+(%{TIMESTAMP_ISO8601:logdate})?\s+(%{LOGLEVEL:loglevel})\s+(?<classname>([^\s]+){0,1})\s+%{GREEDYDATA:body}"
  }
  add_field => [ "received_at", "%{@timestamp}" ]
  add_field => [ "received_from", "%{host}" ]
  tag_on_failure => [ "_not_interested" ]
}
```

The problem is that the logdate field comes through to Kibana as a string, so I cannot apply range ("between") filters to it.

I've read that this is expected behaviour, but I can't find an easy way to solve it.

I would also like to flip the fields so that the "@timestamp" field contains the logdate information - is that as easy as changing the grok to (%{TIMESTAMP_ISO8601:@timestamp})?

Many thanks in advance.

If you want the [@timestamp] field to reflect a value extracted using grok you will need to use a date filter.
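A minimal sketch of what that looks like (when no `target` is set, the date filter writes the parsed value to [@timestamp] by default; the match pattern here assumes the comma-millisecond format shown in the sample log line):

```
date {
  # parses e.g. "2020-12-10 09:15:11,909"
  match => [ "logdate", "yyyy-MM-dd HH:mm:ss,SSS" ]
  # no target => the parsed value overwrites @timestamp
}
```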

Hi @Badger,

I think I've made some limited progress based on what you've said and by following this post.

My grok and date filters now look like this:

```
grok {
  match => {
    "message" => "(?<serverid>([a-z]{3}[0-9]{2}[a-z]{2}[0-9]{2}){0,1})\s+(?<username>([A-Za-z0-9.@\-]*){0,1})\s+(%{TIMESTAMP_ISO8601:logdate})?\s+(%{LOGLEVEL:loglevel})\s+(?<classname>([^\s]+){0,1})\s+%{GREEDYDATA:body}"
  }
  add_field => [ "received_at", "%{@timestamp}" ]
  add_field => [ "received_from", "%{host}" ]
  tag_on_failure => [ "_not_interested" ]
}
date {
  match => [ "logdate", "YYYY-MM-dd HH:mm:ss,SSS" ]
  target => "logdate"
}
```

I was hoping that, following this change, when I created a new index pattern I'd have the option of using "logdate" as the time field, but it isn't showing in the list, and if I proceed to create the index pattern on @timestamp then logdate still shows as a string in the field list on the next page.

I can see it has made a change to the output in the Discover view, as logdate now displays slightly differently from the original input:

Before

2020-12-14 12:18:51,397

After

2020-12-14T12:18:51.397Z

Any further suggestions?

Many thanks,
Ade.

OK, so I managed to figure out how to resolve this.

I can't write back to the original logdate field, as its existing index mapping is already string, so I've written the parsed date to a new field instead - and bingo!
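For anyone following along, the working date filter looks something like this (the target field name `log_timestamp` is just an example - any field not already mapped as string in the index should pick up a date mapping when it's first indexed):

```
date {
  match => [ "logdate", "yyyy-MM-dd HH:mm:ss,SSS" ]
  # write to a fresh field so Elasticsearch maps it as a date,
  # instead of reusing the field already mapped as string
  target => "log_timestamp"
}
```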

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.