Split date (yyyymmdd:hhmmss) and convert into default timestamp in Logstash

I have input like this:
20170301:18544482:INFO 10.0.0.67 ABCLOG sending request
20170301:18321276:ERROR 10.0.0.67 ABCLOG got response

The above are sample inputs for my Logstash, in the following format:
yyyymmdd:hhmmssSS:loglevel IP message

How do I split the timestamp above and convert it into the @timestamp format, so that I can use the log's own time as the default time in Elasticsearch and Kibana?

Use a grok filter to extract the different parts into their own fields, then use a date filter.
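For instance, one way to sketch it (field names here are just illustrative, and this assumes the log line arrives in Logstash's default message field) is to capture the date and time as a single field directly in the grok pattern and hand it to the date filter:

```
filter {
  grok {
    # Capture "20170301:18544482" as one field, then the rest of the line.
    match => { "message" => "(?<log_ts>\d{8}:\d{8}):%{WORD:loglevel} %{IP:ip} %{WORD:logdata} %{GREEDYDATA:logmessage}" }
  }
  date {
    # Parse the captured field; on success the date filter sets @timestamp.
    match => ["log_ts", "yyyyMMdd:HHmmssSS"]
  }
}
```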

I have written a simple grok pattern.
Input: 20170301:18544482:INFO 10.0.0.67 ABCLOG sending request

%{POSINT:time1}:%{POSINT:time2}:%{WORD:loglevel} %{IP:ip} %{WORD:logdata} %{GREEDYDATA:message}

Output:
{
  "time1": "20170301",
  "time2": "18544482",
  "loglevel": "INFO",
  "ip": "10.0.0.67",
  "logdata": "ABCLOG",
  "message": "sending request"
}
Now how do I use the time1 and time2 fields to get a timestamp in the format below?
yyyy-MM-ddTHH:mm:ss.SS

Later I will use this timestamp as the default time in Kibana.

PS: My real question is how to break the time1 and time2 fields into year, date, etc.

You can e.g. combine the time1 and time2 fields into a single field by adding this to your grok filter:

add_field => {
  "timestamp" => "%{time1} %{time2}"
}
remove_field => ["time1", "time2"]

This'll give you a timestamp field containing "20170301 18544482". Feed that to the date filter:

date {
  match => ["timestamp", "yyyyMMdd HHmmssSS"]
}
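Putting the pieces together, the whole filter block might look something like this (a sketch; I've renamed the last capture to logmessage so it doesn't clobber the original message field, and the date filter writes to @timestamp by default):

```
filter {
  grok {
    match => { "message" => "%{POSINT:time1}:%{POSINT:time2}:%{WORD:loglevel} %{IP:ip} %{WORD:logdata} %{GREEDYDATA:logmessage}" }
    # Glue the two captures back together into one parseable field...
    add_field => { "timestamp" => "%{time1} %{time2}" }
    # ...and drop the intermediate fields.
    remove_field => ["time1", "time2"]
  }
  date {
    # lowercase yyyy is calendar year; uppercase YYYY is Joda week-year,
    # which gives wrong results around the new year.
    match => ["timestamp", "yyyyMMdd HHmmssSS"]
  }
}
```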

Although I'm not sure what "18544482" means. Is "82" the number of milliseconds but without a leading zero? What happens if the milliseconds are greater than 100? Will we see e.g. "185444182" then?
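As for the PS about splitting time1 and time2 into their components: if you really need year, month, etc. as separate fields, a second grok with named regex captures can break up the digits (a sketch; the field names are just examples):

```
grok {
  # "20170301" -> year/month/day
  match => { "time1" => "(?<year>\d{4})(?<month>\d{2})(?<day>\d{2})" }
}
grok {
  # "18544482" -> hour/minute/second/centiseconds
  match => { "time2" => "(?<hour>\d{2})(?<minute>\d{2})(?<second>\d{2})(?<centis>\d{2})" }
}
```

For setting @timestamp, though, the combined field plus the date filter above is all you need; these extra fields are only useful if you want to query or aggregate on them directly.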

Thanks a ton, Magnus. Usage of the date filter is much clearer to me now.
And yes, for milliseconds we take only the first two digits. If the milliseconds are greater than 100, say 869, we get 18544386 as output.
We discard the last digit of the milliseconds.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.