Greetings Logstashers,
I wrote a grok parser for one of my logs; the log in question contains a Unix timestamp.
Here's a line from the log:
{"name":"firefox.exe","pid":"7668","start_time":"1585318348"}
Here's what my grok filter looks like:
filter {
  if "osquery" in [tags] {
    # strip any square bracket characters from the raw message
    mutate { gsub => [ "message", "[\[\]]", "" ] }
    grok {
      match => {
        "message" => [ '\"name\":"%{QS:applicationName}",\"pid\":\"%{NUMBER:applicationPid}\",\"start_time\":\"%{NUMBER:applicationDate}\"' ]
      }
    }
    # convert the epoch-seconds string into a real timestamp
    date { match => ["applicationDate", "UNIX"] target => ["applicationDate"] }
    # QS captures the surrounding quotes, so strip any leftover quotes afterwards
    mutate { gsub => ["applicationName", "\"", "", "applicationPid", "\"", "", "applicationDate", "\"", ""] }
  }
}
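(As an aside: since the line is already valid JSON, I suspect I could have skipped grok and the quote-stripping entirely and let the json filter do the parsing, roughly like this, but I went the grok route first.)

filter {
  if "osquery" in [tags] {
    # parse the whole JSON line into top-level fields (name, pid, start_time)
    json { source => "message" }
  }
}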
I had serious trouble getting my index pattern in Kibana to display the Unix timestamp properly. The number was parsed, and when I formatted the "applicationDate" field in my index pattern as a string, I saw the literal number "1585318348" in the field.
However, when I changed the index pattern to display the "applicationDate" field as a date, it ALWAYS showed an epoch time of 0 (so it always said the date was 01.01.1970). Only when I added the
date { match => ["applicationDate", "UNIX"] target => ["applicationDate"] }
part was the Unix timestamp displayed correctly.
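With the date filter in place, the relevant fields come out roughly like this (values taken from the sample line above; the ISO timestamp is what I'd expect 1585318348 to convert to):

{
  "applicationName": "firefox.exe",
  "applicationPid": "7668",
  "applicationDate": "2020-03-27T14:12:28.000Z"
}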
I initially thought Logstash simply creates a JSON document, and since JSON values are only arrays, objects, strings, numbers, or booleans, the way those values are displayed would be up to the index mapping. Are Unix timestamps exempt from this?
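To spell out what I mean by "up to the index mapping": I assumed an explicit date mapping with the epoch_second format (which Elasticsearch supports, if I'm reading the docs right) would be enough for Kibana to render the raw number as a date, without needing the date filter. Something like:

# "my-osquery-index" is just a placeholder for my actual index name
PUT my-osquery-index
{
  "mappings": {
    "properties": {
      "applicationDate": { "type": "date", "format": "epoch_second" }
    }
  }
}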