@timestamp one day earlier than Event timestamp


Kibana displays the document in CDT, the event itself is in EDT, and @timestamp is stored in UTC.
When Logstash collects the message below, the time is "Tue Mar 21 11:40:00 EDT 2018".
Rubydebug shows the field as "timestamp" => "Tue Mar 21 11:40:00 EDT 2018".
The Kibana document shows the field as "? timestamp Tue Mar 21 11:40:00 EDT 2018".
Kibana flags the timestamp field with "No cached mapping for this field. Refresh field list from Management > Index Patterns page."
The timestamp field is not listed on the Index Patterns page, and no amount of refreshing will make it appear.
The problem in question is that the @timestamp field displays a date 20 hours earlier than the event occurred: "@timestamp" => "2018-03-20T15:40:00.000Z".
Because of the unusual date format, the date.pattern file listed below was created to parse the timestamp.
I know you have helped others. Maybe you can decipher this one.

message:
<><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40

date.pattern file:
AUDIT_MONTHDAY (?:(?:[0 ][1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
DATE_AUDIT %{DAY}%{SPACE}%{MONTH}%{SPACE}%{AUDIT_MONTHDAY}%{SPACE}%{TIME}%{SPACE}%{TZ}%{SPACE}%{YEAR}
DATE_TZ %{DATESTAMP:timestamp}%{SPACE}%{TZ}
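
For reference, DATE_AUDIT is meant to match strings like "Tue Mar 21 11:40:00 EDT 2018". A minimal throwaway pipeline to exercise the pattern file on its own might look like this (the stdin/stdout plugins are just for testing; the patterns_dir path is the same one used in the filter below):

        input { stdin {} }

        filter {
                grok {
                        patterns_dir => ["/home/dplrgid8/logstash/dev/bin/patterns"]
                        match => { "message" => "%{DATE_AUDIT:timestamp}" }
                }
        }

        output { stdout { codec => rubydebug } }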

Filter:
if "_grokparsefailure" in [tags] {
grok {
patterns_dir => ["/home/dplrgid8/logstash/dev/bin/patterns"]
match => { "message" => "^<><><><>%{SPACE}[%{DATE_AUDIT:timestamp}]%{SPACE}%{GREEDYDATA:narrative}" }
remove_tag => ["_grokparsefailure"]
add_field => [ "tags", "grok25098" ]
}
date {
match => [ "timestamp", "EEE MMM dd HH:mm:ss zzz yyyy" ]
}
}

Rubydebug Output:

{
      "@timestamp" => "2018-03-20T15:40:00.000Z",
         "message" => "<><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40 ",
        "@version" => "1",
            "path" => "/home/dplrgid8/logstash/data360/logs/Backup_Audit.log",
            "host" => "FQDN",
            "type" => "Backup_Audit",
     "application" => "Data_360_-_Find",
       "component" => "MARIA_DB",
     "environment" => "qa",
"logstash_version" => "2.4.0",
     "filter_date" => "2018/03/21 10:38:36 CDT",
           "geoip" => {
    "location" => {
        "lat" => "LAT",
        "lon" => "-LON"
    }
  },
            "tags" => [
    [0] "grok25098"
  ],
       "timestamp" => "Tue Mar 21 11:40:00 EDT 2018",
       "narrative" => "AUDITLOGS Backup Success.5-11:40 "
}

Kibana Document:
March 20th 2018, 10:40:00.000 Data_360_-_Find
MARIA_DB
FQDN
- - AUDITLOGS Backup Success.5-11:40
Backup_Audit
Link to /dplr-qa-logstash-2018.03.20/Backup_Audit/AWJJOc3GNltr8GYV7Vm7


@timestamp March 20th 2018, 10:40:00.000
t @version 1
t _id AWJJOc3GNltr8GYV7Vm7
t _index dplr-qa-logstash-2018.03.20

_score -

t _type Backup_Audit
t application Data_360_-_Find
t component MARIA_DB
t environment qa
t filter_date 2018/03/21 10:38:36 CDT
geoip.location {
"lat": "LAT",
"lon": "-LON"
}
t host FQDN
t logstash_version 2.4.0
t message <><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40
t narrative AUDITLOGS Backup Success.5-11:40
t path /home/dplrgid8/logstash/data360/logs/Backup_Audit.log
t tags grok25098
? timestamp Tue Mar 21 11:40:00 EDT 2018
t type Backup_Audit

Hmm. According to the documentation, timezone names like EDT can't be parsed (they're ambiguous), so I'm a bit surprised you're getting anything at all.

March 21st is a Wednesday. With EEE in the pattern, the Tue in the string wins and forces the date back to Tuesday the 20th.

It appears to interpret EDT as Etc/GMT+4. If that's not the right EDT, then a mutate+gsub could switch in the right timezone identifier (which might require using ZZZ rather than zzz).
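
Untested, but the gsub half of that might look roughly like this (US/Eastern is a guess at which zone your EDT really is):

        mutate {
                # swap the ambiguous abbreviation for a full Java time zone ID
                gsub => [ "timestamp", "EDT", "US/Eastern" ]
        }
        date {
                # ZZZ accepts full zone IDs like US/Eastern; zzz does not
                match => [ "timestamp", "EEE MMM dd HH:mm:ss ZZZ yyyy" ]
        }

The bogus day name is a separate problem; as long as EEE stays in the pattern, a wrong Tue will keep dragging the date around.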

3/22/2018
Sorry for the slow response, but I think I have a solution based on your fabulous input.
I broke the DATE_AUDIT pattern down into its component parts, substituted the Java time zone ID US/Eastern for EDT, and then rebuilt the timestamp with add_field, leaving out the %{day} component.
Changed the timestamp match to ZZZ, as you recommended.

            if "_grokparsefailure" in [tags] {
                    grok {
                            patterns_dir => ["/home/dplrgid8/logstash/dev/bin/patterns"]
                            ### match => { "message" => "^<><><><>%{SPACE}\[%{DATE_AUDIT:timestamp}\]%{SPACE}%{GREEDYDATA:narrative}" }
                            match => { "message" => "^<><><><>%{SPACE}\[%{DAY:day}%{SPACE}%{MONTH:month}%{SPACE}%{AUDIT_MONTHDAY:monthday}%{SPACE}%{TIME:time}%{SPACE}%{TZ:tz}%{SPACE}%{YEAR:year}\]%{SPACE}%{GREEDYDATA:narrative}" }
                            remove_tag => ["_grokparsefailure"]
                            add_field => [ "tags", "grok25397" ]
                    }

                    mutate {
                            gsub => [
                                    # Replace Time Zone with Java Time Zone ID
                                    "tz", "EDT", "US/Eastern",
                                    "tz", "EST", "US/Eastern",
                                    "tz", "CDT", "US/Central",
                                    "tz", "CST", "US/Central",
                                    "tz", "MDT", "US/Mountain",
                                    "tz", "MST", "US/Mountain",
                                    "tz", "PDT", "US/Pacific",
                                    "tz", "PST", "US/Pacific"
                            ]

                            add_field => { "timestamp" => "%{month} %{monthday} %{time} %{tz} %{year}" }
                    }

                    date {
                            match => [ "timestamp", "MMM dd HH:mm:ss ZZZ yyyy" ]
                    }
            }
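
For comparison, if every log line were guaranteed to arrive in one zone, the date filter's own timezone option could stand in for the whole gsub table; a sketch of that variant, assuming US/Eastern and the same captured fields as above:

                        mutate {
                                # leave the zone out of the rebuilt string entirely
                                add_field => { "timestamp" => "%{month} %{monthday} %{time} %{year}" }
                        }

                        date {
                                # the filter supplies the zone instead of parsing it
                                timezone => "US/Eastern"
                                match => [ "timestamp", "MMM dd HH:mm:ss yyyy" ]
                        }

The gsub table has the advantage of handling mixed zones per event, which is the safer choice when logs come from hosts in different zones.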

Rubydebug Output:

Pipeline main started
{
      "@timestamp" => "2018-03-21T16:00:00.000Z",
         "message" => "<><><><> [Wed Mar 21 12:00:00 EDT 2018] DUMP Backup Success Mar 21 12:00:00 US/Eastern 2018",
        "@version" => "1",
            "path" => "/home/dplrgid8/logstash/data360/logs/Backup_DUMP.log",
            "host" => "FQDN",
            "type" => "Backup_DUMP",
     "application" => "Data_360_-_Find",
       "component" => "MARIA_DB",
     "environment" => "qa",
"logstash_version" => "2.4.0",
     "filter_date" => "2018/03/21 16:47:49 CDT",
           "geoip" => {
    "location" => {
        "lat" => "37.66",
        "lon" => "-122.096839"
    }
  },
            "tags" => [
    [0] "grok25397"
  ],
             "day" => "Wed",
           "month" => "Mar",
        "monthday" => "21",
            "time" => "12:00:00",
              "tz" => "US/Eastern",
            "year" => "2018",
       "narrative" => "DUMP Backup Success Mar 21 12:00:00 US/Eastern 2018",
       "timestamp" => "Mar 21 12:00:00 US/Eastern 2018"
}

In Kibana, @timestamp displays in CDT (11:00 AM in this case), which is the correct date and time:

March 21st 2018, 11:00:00.000	Data_360_-_Find
MARIA_DB	FQDN
 - 	 - 	DUMP Backup Success Mar 21 12:00:00 US/Eastern 2018
Backup_DUMP
Link to /dplr-qa-logstash-2018.03.21/Backup_DUMP/AWJOd3BfdxJQCSK05ofa


@timestamp 		March 21st 2018, 11:00:00.000
t @version 		1
t _id 		AWJOd3BfdxJQCSK05ofa
t _index 		dplr-qa-logstash-2018.03.21
 _score 		 - 
t _type 		Backup_DUMP
t application 		Data_360_-_Find
t component 		MARIA_DB
? day 		Wed
t environment 		qa
t filter_date 		2018/03/21 16:47:49 CDT
geoip.location 		{
  "lat": "37.66",
  "lon": "-122.096839"
}
t host 		FQDN
t logstash_version 		2.4.0
t message 		<><><><> [Wed Mar 21 12:00:00 EDT 2018] DUMP Backup Success Mar 21 12:00:00 US/Eastern 2018
? month 		Mar
? monthday 		21
t narrative 		DUMP Backup Success Mar 21 12:00:00 US/Eastern 2018
t path 		/home/dplrgid8/logstash/data360/logs/Backup_DUMP.log
t tags 		grok25397
? time 		12:00:00
? timestamp 		Mar 21 12:00:00 US/Eastern 2018
t type 		Backup_DUMP
? tz 		US/Eastern
? year 		2018

Thanks for kick-starting my brain. I never would have thought about breaking the date down into its component parts if you had not mentioned it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.