Is there a way to map a date to milliseconds since epoch for Logstash?

Hi,

I have a date field (timestamp; the ES type is date) in my current Elasticsearch cluster and, among other things, I want to use Logstash to create a field containing the milliseconds since epoch for that timestamp. Now, https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html shows that there are plenty of ways to go from milliseconds since epoch to a date, but is there a way to go the other way (from a date to milliseconds since epoch)?

Cheers,
EuclideanSearch

You could use the ruby filter to do something like this.
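A minimal sketch of that approach, assuming the Logstash 5.x event API (event.get/event.set) and a source field named timestamp as in your post; the [@metadata][ts] and timestamp_ms names are just illustrative:

filter {
  # Parse the ISO8601 string into a Logstash Timestamp object first,
  # keeping it under @metadata so the intermediate value isn't indexed.
  date {
    match  => ["timestamp", "ISO8601"]
    target => "[@metadata][ts]"
  }
  ruby {
    # Timestamp#to_f returns seconds since epoch (with a fractional part),
    # so multiply by 1000 and truncate to get whole milliseconds.
    code => "event.set('timestamp_ms', (event.get('[@metadata][ts]').to_f * 1000).to_i)"
  }
}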

Thank you hartfordfive. Similarly, is there an 'out of the box' way to get an ISO8601 string from a date field, or will a ruby script be required for that as well?

When you say going from a date field to an ISO8601 date string, what exactly is the format of your date field? Is it still the millisecond timestamp? Could you please include some examples here just to clarify things?

My date field has a date mapping in Elasticsearch. For example, a document looks like this:
{
  "uuid": "6ea7bb7d-95c5-4e4d-9649-8a80021049a2", <- string mapping in ES
  "timestamp": "2016-11-29T20:08:00.775Z", <- date mapping in ES
  ... other fields ...
}

Well, if you're simply using something like Kibana to query the data, you'll already see an ISO8601-compliant UTC date/time in the "@timestamp" field. I guess I just don't understand your use case, i.e. why you'd need to do this extra conversion in Logstash on each event if you already have a valid date field.
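That said, if you did need the ISO8601 string as its own field, a ruby filter would work here too. A rough sketch, assuming @timestamp is already populated (e.g. by a date filter) and the Logstash 5.x event API; the timestamp_iso field name is just an example:

filter {
  ruby {
    # Time#iso8601 comes from the stdlib 'time' library.
    init => "require 'time'"
    # @timestamp is a Logstash Timestamp; to_f gives seconds since epoch,
    # and Time.at(...).utc.iso8601(3) renders it as an ISO8601 string with
    # millisecond precision, e.g. 2016-11-29T20:08:00.775Z.
    code => "event.set('timestamp_iso', Time.at(event.get('@timestamp').to_f).utc.iso8601(3))"
  }
}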
