How to store @timestamp in UNIX_MS instead of ISO date format

Currently, @timestamp is in ISO date format. I want to store the same date in epoch milliseconds.

Sample code:

    filter {
        ruby { code => "event.set('epochs', ((event.get('@timestamp').to_f*1000).to_i).to_s)" }
        date {
            remove_field => [ "@timestamp" ]
            match        => [ "epochs", "UNIX_MS" ]
            target       => "@timestamp"
        }
    }

In the output there is no @timestamp; only "epochs" => "1548322801689" is observed.

I also tried:

    filter {
        ruby { code => "event.set('epoch', ((event.get('@timestamp').to_f*1000).to_i).to_s)" }

        grok {
            remove_field => ["@timestamp"]
        }

        date {
            match  => [ "epoch", "UNIX_MS" ]
            target => "@timestamp"
        }
    }
Output: "@timestamp" => "2019-01-24T09:40:01.689Z" and "epochs" => "1548322801689"

Could you help? Thank you.

"decoration", which is what we call the application of common options like remove_field, happens after the filter successfully executes. So in this case it parses epochs into @timestamp, and then removes @timestamp if there were no errors.

For the second case, a date filter creates a LogStash::Timestamp, which is always going to look like

"@timestamp" => 2019-01-24T13:37:15.155Z

If that is not the output format you want, then do not use a date filter.
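
To make that concrete, here is a rough sketch (the ts_string field name is just an illustration) showing that a LogStash::Timestamp always stringifies to that ISO 8601 form, whatever format it was originally parsed from:

    filter {
        ruby {
            code => "
                ts = event.get('@timestamp')      # a LogStash::Timestamp object
                event.set('ts_string', ts.to_s)   # e.g. '2019-01-24T13:37:15.155Z'
            "
        }
    }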

Thank you for your reply.

Could you suggest a way to get @timestamp in epoch milliseconds?

Currently, the output is in ISO format, "@timestamp": "2019-01-24T09:40:01.689Z", but I want it like this: "@timestamp": "1548322801689".

I do not think that is possible. If you do something like

    ruby { code => "event.set('epoch', ((event.get('@timestamp').to_f*1000).to_i).to_s)" }
    mutate { remove_field => [ "@timestamp" ] }
    mutate { rename => { "epoch" => "@timestamp" } }

It will raise an exception, because Logstash requires @timestamp to be a LogStash::Timestamp rather than a string. You have to use a different field name.
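
A sketch of that workaround (epoch_ms is just a name I picked, untested):

    filter {
        # keep the millisecond value under its own name; @timestamp itself must
        # remain a LogStash::Timestamp, so it cannot hold the string form
        ruby   { code => "event.set('epoch_ms', ((event.get('@timestamp').to_f * 1000).to_i).to_s)" }
        # optionally drop @timestamp if nothing downstream needs it
        mutate { remove_field => [ "@timestamp" ] }
    }

Anything that needs the millisecond value downstream would then reference epoch_ms instead of @timestamp.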

Sure, Thank you.

Cheers!
