Recommended way to force date fields to be UTC

Hello,

I am using LogStash and ElasticSearch 5.6.10.

My documents have a few fields of type "date" besides @timestamp.
For each of them, the template includes a section like this:

"my_datefield_1": {
	"type": "date",
	"format": "MM/dd/yy HH:mm:ss||MM/dd/yy HH:mm:ss.SSS"
},

I have just discovered that ElasticSearch expects them to be in UTC. Since they are not (they are in local time), when I visualize the data in Kibana all dates are shifted by 4 hours, the difference between my local time and UTC.

So I was wondering what the best way is to convert the content of those fields to UTC when dumping the data into ElasticSearch:

  1. Should I use the LogStash Filter plugin "Alter" to change their values? Or maybe the Ruby plugin?
  2. Is there a way to use the LogStash Filter plugin "Date" to convert them from local time to UTC?
  3. Or is it actually possible to do it through the ElasticSearch Template, by telling it that the dates come in local time and letting ElasticSearch do the conversion to UTC itself?

I don't think 2 and 3 are actually possible, and that I should go for 1.
But there is no harm in asking, in case I am missing something.

Thanks a lot in advance,
Jose

That is the function of the date filter.

Hi @Badger, thanks a lot for such a prompt response.

I read this in the documentation:

The date filter is used for parsing dates from fields, and then using that date or timestamp as the logstash timestamp for the event.

The part I am concerned about is "and then using that date or timestamp as the logstash timestamp for the event."
I already have another field acting as the timestamp.
I don't want any of the new ones, after being converted to UTC with the Date filter plugin, to suddenly become the timestamp.
Or can that be avoided simply by using the target option?

Would something like this work?

filter {
    date {
        # parse the field with either pattern
        match => ["my_datefield_1", "MM/dd/yy HH:mm:ss.SSS", "MM/dd/yy HH:mm:ss"]
        # write the result back to the same field instead of @timestamp
        target => "my_datefield_1"
    }
}

Yes, and yes.
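
Note that when the string carries no timezone, the filter parses it in the Logstash host's local zone and stores the result in UTC, so that config does the conversion as long as Logstash runs in the same zone as the data. If it does not, you can set the zone explicitly. A minimal sketch, with America/New_York as a placeholder for your own zone:

filter {
    date {
        match => ["my_datefield_1", "MM/dd/yy HH:mm:ss.SSS", "MM/dd/yy HH:mm:ss"]
        # zone the naive local-time strings were recorded in (placeholder, use yours)
        timezone => "America/New_York"
        # keep the UTC result in the same field rather than @timestamp
        target => "my_datefield_1"
    }
}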

In that case, the documentation is a little bit misleading, as it makes you believe that my_datefield_1 would become the new timestamp for the document in ElasticSearch...

One thing I should mention is that the target of the date filter is of type LogStash::Timestamp

"my_datefield_1" => 2019-08-01T20:55:48.719Z,

So elasticsearch will (perhaps once you roll to a new index) store it as a date type, and Kibana will give you the option (using a dropdown, if I recall correctly) of using it as a timestamp, but will default to using @timestamp.

Just one final question. I am not sure how to handle the transition.

As I mentioned, the current ElasticSearch Template includes a section like this:

"my_datefield_1": {
	"type": "date",
	"format": "MM/dd/yy HH:mm:ss||MM/dd/yy HH:mm:ss.SSS"
},

That currently works because "my_datefield_1", as created by LogStash, is a string in one of those formats.

Once I start using the new LogStash configuration, which converts "my_datefield_1" into a LogStash::Timestamp, I will need a new Template without the format, right?

"my_datefield_1": {
	"type": "date"
},

But the new Template will only start being used when a new index is created, which, as usual, will be the next day. Indices are named like "my_index-<YYYY.MM.DD>".

I am not sure how to make ElasticSearch accept both the old type (a string in one of those formats) and the new type (a timestamp) for "my_datefield_1" during the day of the transition. Is it possible?
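
The only idea I have is listing both in the Template: as far as I can tell, "format" accepts several patterns separated by ||, and a LogStash::Timestamp is serialized as ISO 8601, which should match the built-in strict_date_optional_time pattern. So maybe something like this would cover the transition (untested, just a guess on my part):

"my_datefield_1": {
	"type": "date",
	"format": "MM/dd/yy HH:mm:ss||MM/dd/yy HH:mm:ss.SSS||strict_date_optional_time"
},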

I do not know. That is an elasticsearch question, not a logstash question.

:slight_smile: True.
I will post the question on the ES forum.

Thanks a lot for everything!!
