Date filter needed for unix timestamps?

Greetings Logstashers,

I wrote a grok parser for one of my logs; the log in question contains a Unix timestamp.
Here's a line from the log:

{"name":"firefox.exe","pid":"7668","start_time":"1585318348"}

Here's what my grok filter looks like:

filter {
  if "osquery" in [tags] {
    # strip the literal [ and ] characters out of the message
    mutate { gsub => [ "message", "[\[\]]", "" ] }
    grok {
      match => {
        "message" => [ '\"name\":"%{QS:applicationName}",\"pid\":\"%{NUMBER:applicationPid}\",\"start_time\":\"%{NUMBER:applicationDate}\"' ]
      }
    }
    # parse the captured epoch seconds into a real timestamp
    date { match => ["applicationDate", "UNIX"] target => ["applicationDate"] }
    # remove leftover double quotes from the captured fields
    mutate { gsub => ["applicationName", "\"", "", "applicationPid", "\"", "", "applicationDate", "\"", ""] }
  }
}
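
(A throwaway pipeline with a generator input and a rubydebug stdout output is one way to inspect exactly what this filter emits before anything reaches Elasticsearch. A minimal sketch, reusing the sample line and tag from the post:

input {
  generator {
    count   => 1
    message => '{"name":"firefox.exe","pid":"7668","start_time":"1585318348"}'
    tags    => ["osquery"]
  }
}
# ... the filter block above goes here ...
output {
  stdout { codec => rubydebug }
}

Running that prints every field grok produced, which makes it easier to see whether a value is leaving Logstash as a string, a number, or a timestamp.)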

I had serious trouble getting my Kibana index pattern to display the Unix timestamp properly. The number was parsed, and when I formatted the "applicationDate" field in my index pattern as a string, I saw the literal number "1585318348" in the field.

However, when I changed the index pattern to format the "applicationDate" field as a date, it ALWAYS showed a date of 01.01.1970 (the start of the epoch). Only when I added the

date { match => ["applicationDate", "UNIX"] target => ["applicationDate"] }

line was the Unix timestamp displayed correctly.
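
(For reference: with that date filter in place, the field is converted to an ISO8601 timestamp before indexing. 1585318348 seconds works out to 2020-03-27T14:12:28 UTC, which Kibana can then format as a date without any guesswork.)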

I initially thought Logstash simply creates a JSON document, JSON values are either arrays, strings, numbers, or bools, and how those values are displayed is up to the index mapping. Are Unix timestamps exempt from this?

If you configure Elasticsearch to expect a field to be a date, then when Logstash tries to index a number into it, Elasticsearch will parse that number as epoch_millis. So 1585318348 would be January 19th 1970.
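
To make the arithmetic concrete: taken as milliseconds, 1585318348 ms is only about 18.35 days after the epoch (1585318348 / 86,400,000 ms per day), which lands on January 19th 1970. Taken as seconds, the very same number is 2020-03-27T14:12:28 UTC. The two interpretations differ by exactly the factor of 1000.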

That is what I expected too, but then I went to a website to get a Unix timestamp for this very same day, and it was still parsed as a 1970 date; that's where my confusion came from.
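
(For what it's worth, the date filter isn't the only way out: if the index mapping, or an index template, declares the field with the epoch_second date format, Elasticsearch will parse the raw seconds value itself. A minimal sketch, assuming a hypothetical index name osquery-test:

PUT osquery-test
{
  "mappings": {
    "properties": {
      "applicationDate": { "type": "date", "format": "epoch_second" }
    }
  }
}

With that mapping the grok output could be indexed as-is; using the date filter instead simply keeps the conversion inside Logstash.)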
