Set logdate as @timestamp using Logstash

Hi Everyone,

I am trying to parse an nginx log file and want to set its log date as @timestamp. I am using the date filter plugin for this.

Here's a sample date from my log file:
03/Apr/2019:16:33:03 +0530

The date plugin in my filter looks like this:
date {
  match => ["date", "dd/MMM/yyyy:HH:mm:ss Z"]
}

I am not getting any _dateparsefailure errors, but the @timestamp field is still in UTC and does not match the log date.

I tried this too:
date {
  match => ["date", "dd/MMM/YYYY:HH:mm:ss Z"]
}

And this:
date {
  match => ["date", "dd/MMM/yyyy:HH:mm:ss Z"]
  target => "@timestamp"
}

Please help!

Vaibhav

elasticsearch always stores dates as UTC, and @timestamp will always be formatted the same way:

"@timestamp" => 2019-04-03T14:41:43.295Z

If you want the parsed date to be written to a field other than @timestamp, then use the target option. If "date" is not being parsed into @timestamp (in UTC) and you are not getting a _dateparsefailure tag, then "date" does not exist.

What do the relevant fields on one of your events look like (use stdout { codec => rubydebug } or the JSON tab in Kibana), and what do you want them to look like?
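For example, a minimal output section for inspecting events might look like this (a sketch, to be added alongside any existing outputs):

output {
  # Print every event to the Logstash console so the "date" and
  # "@timestamp" fields can be compared side by side.
  stdout { codec => rubydebug }
}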

Hi, here is one of the sample documents that I have in the index:

{
  "_index": "online-nginx-logs-2019.04.03",
  "_type": "doc",
  "_id": "mV8s5GkBIUXkHEbLdn0Q",
  "_version": 1,
  "_score": null,
  "_source": {
    "referer": "https://www.moglix.com/ngsw-worker.js ",
    "geoip": {
      "country_code2": "IN",
      "country_code3": "IN",
      "ip": "157.41.216.197",
      "continent_code": "AS",
      "timezone": "Asia/Kolkata",
      "latitude": 20,
      "location": {
        "lat": 20,
        "lon": 77
      },
      "country_name": "India",
      "longitude": 77
    },
    "clientip": "157.41.216.197",
    "source": "/var/log/nginx/moglix.access.log",
    "request_time": 0,
    "action": "GET",
    "upstream_port": "80",
    "user_agent": {
      "os_minor": "1",
      "device": "Generic Smartphone",
      "os": "Android",
      "os_name": "Android",
      "build": "",
      "os_major": "8",
      "patch": "3683",
      "major": "73",
      "name": "Chrome Mobile",
      "minor": "0"
    },
    "request_method": "GET",
    "host": {
      "name": "lb01"
    },
    "bytes_sent": 22921,
    "upstream_status": 200,
    "prospector": {
      "type": "log"
    },
    "@timestamp": "2019-04-03T17:08:31.000Z",
    "log_type": "nginx_access",
    "API": "/54.616e8c6f032ce555ab44.js",
    "input": {
      "type": "log"
    },
    "offset": 707418872,
    "module_name": "online",
    "upstream_address": "10.0.3.182",
    "upstream_response_time": 0.004,
    "upstream_connect_time": 0.004,
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "date": "03/Apr/2019:22:38:31 +0530",
    "beat": {
      "version": "6.5.4",
      "hostname": "lb01",
      "name": "lb01"
    },
    "@version": "1",
    "source_type": "nginx",
    "request_length": 51,
    "upstream_header_time": 0.004,
    "status": 200,
    "http_version": "2.0"
  },
  "fields": {
    "@timestamp": [
      "2019-04-03T17:08:31.000Z"
    ]
  },
  "sort": [
    1554311311000
  ]
}

As you can see, there's a difference of +0530 between the "date" field and the "@timestamp" field. I want the @timestamp field to hold the value of the "date" field, which I am parsing through the date filter.

The need arises because I want all the logs of one day in one particular index. Is there something I am missing in the date filter?
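For context, the daily index name is usually built from @timestamp in the elasticsearch output, so the UTC date decides which index an event lands in. A minimal sketch (the hosts value is assumed; the index pattern is inferred from the sample document above):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # %{+YYYY.MM.dd} is rendered from the event's @timestamp,
    # always in UTC, so the UTC date picks the daily index.
    index => "online-nginx-logs-%{+YYYY.MM.dd}"
  }
}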

Your date field specifies that it is five and a half hours ahead of UTC. elasticsearch always stores dates as UTC. If you remove the +0530 using mutate+gsub, the date filter will assume it is UTC.
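A minimal sketch of that gsub (the pattern is one way to strip a trailing four-digit UTC offset; the "date" field name matches your event):

mutate {
  # Remove a trailing " +HHMM" / " -HHMM" offset, e.g.
  # "03/Apr/2019:22:38:31 +0530" -> "03/Apr/2019:22:38:31"
  gsub => ["date", " [+-][0-9]{4}$", ""]
}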

Not sure why you would care about getting the data for one day (local time) into one index. Downstream tools (e.g. Kibana) will convert from browser time to UTC before doing time-based queries.

Hi,

I tried parsing the date field into a separate field, "log_timestamp". I modified my date filter slightly and also added the mutate+gsub.
So together they look like this:

mutate {
  gsub => ["date", " [+-][0-9][0-9][0-9][0-9]", ""]
}
date {
  match => [ "date", "dd/MMM/yyyy:HH:mm:ss" ]
  timezone => "Asia/Kolkata"
  target => "log_timestamp"
}

So now my date field is: 03/Apr/2019:22:38:31

But still, the log_timestamp field is: 2019-04-03T17:08:31.000Z

I even added the timezone parameter like this:

timezone => "Asia/Kolkata"

Still didn't work.

We have a separate mailing service that reads data directly from the current day's elasticsearch index and sends out mail alerts based on custom rules that can be configured.

Add

timezone => "UTC"

to the date filter.


Okay, thanks Badger. This fixes my problem.

For anybody else who runs into the same problem, here's what worked for me (the gsub strips the +0530 offset, and timezone => "UTC" then makes the date filter keep the local wall-clock time in @timestamp):

mutate {
  gsub => ["date", " [+-][0-9][0-9][0-9][0-9]", ""]
}
date {
  match => [ "date", "dd/MMM/yyyy:HH:mm:ss" ]
  timezone => "UTC"
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.