New timestamp using dynamic timezone not working

I am trying to add a field called localtimestamp based on the UNIX timestamp received in the JSON message:

{"event": 1,"content": ["test","value1","value2","value3","value4"],"timestamp": 1502844683}

This is my filter in Logstash:

date {
    match => [ "timestamp", "UNIX" ]
    timezone => "%{timezone}"
    target => "localtimestamp"
}

The timezone field is set per event, e.g. "timezone" => "America/Vancouver".

I was expecting timestamp 1502844683 (Wednesday, August 16, 2017 12:51:23 AM GMT) to be converted to a localtimestamp of Tuesday, August 15, 2017 05:51:23 PM (America/Vancouver is UTC-7 in August), but I get this:

"localtimestamp" => 2017-08-16T00:51:23.000Z. --> appears to be GMT time
"@timestamp" => 2017-08-16T01:46:00.230Z. --> appears to be local UTC time

Any help would be appreciated.

Unix timestamps (i.e. seconds since the epoch) are by definition always UTC, and @timestamp is also always UTC. The timezone option tells the date filter which timezone the source timestamp should be interpreted in, so it has no effect when the UNIX or UNIX_MS patterns are used: an epoch value is already unambiguous.
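For illustration, here is a minimal sketch of a case where the timezone option does apply, using a hypothetical logdate field that holds a local wall-clock string carrying no offset of its own:

    date {
        match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
        timezone => "America/Vancouver"   # interpret logdate as Vancouver local time
        target => "@timestamp"            # the parsed result is still stored in UTC
    }

Even then the output is stored in UTC; the option only changes how the input string is interpreted.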

I tried this unsuccessfully. Any chance you could guide me to a solution so that I can have a new field with a converted timestamp?

 # "@timestamp": "2017-08-16T01:17:09.689Z"
  mutate {
        add_field => {
            # Create a new field with string value of the UTC event date
            "localtimestamp" => "%{@timestamp}"
        }
    }

      date {
        match => [ "localtimestamp","yyyy-MM-dd'T'HH:mm:ss", "ISO8601" ]
        timezone => "%{timezone}"
        target => "localtimestamp"
    }

The date filter always produces a UTC timestamp. If you want a timestamp in the local timezone you'll have to use a ruby filter with some custom code.
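Here is a minimal sketch of such a ruby filter, assuming the epoch seconds arrive in the timestamp field and the zone name in the timezone field; since Logstash runs on JRuby, the JVM's java.time API can be used directly:

    ruby {
        code => "
            tz = event.get('timezone')
            epoch = event.get('timestamp')   # UNIX seconds from the JSON message
            if tz && epoch
                local = java.time.Instant.ofEpochSecond(epoch.to_i)
                                         .atZone(java.time.ZoneId.of(tz))
                                         .toLocalDateTime()
                # Timestamp fields are always UTC, so store the local
                # wall-clock time as a plain string instead
                event.set('localtimestamp', local.toString())
            end
        "
    }

With timestamp 1502844683 and timezone America/Vancouver this sets localtimestamp to 2017-08-15T17:51:23, i.e. the expected local wall-clock time from the example above.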
