@timestamp one day (24 hours) earlier than Event timestamp
Kibana displays the Document in CDT, the Event was received in EDT, and @timestamp is in UTC.
When Logstash collects the message below, the time is "Tue Mar 21 11:40:00 EDT 2018".
Rubydebug displays the field timestamp as "timestamp" => "Tue Mar 21 11:40:00 EDT 2018".
The Kibana Document displays the field timestamp as "? timestamp Tue Mar 21 11:40:00 EDT 2018"
The timestamp field complains "No cached mapping for this field. Refresh field list from Management > Index Patterns page."
The timestamp field is not listed on the Index Patterns page and no amount of refreshing will make it appear.
The problem in question is that the @timestamp field displays a date a full day earlier than the Event occurred: "@timestamp" => "2018-03-20T15:40:00.000Z".
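For reference, the expected conversion can be checked by hand (a plain-Python sketch, not part of the pipeline): 11:40 EDT is UTC-4, so the correct @timestamp would be 2018-03-21T15:40:00Z, and the emitted value is exactly 24 hours earlier than that.

```python
from datetime import datetime, timedelta, timezone

# The event time from the log line: Tue Mar 21 11:40:00 EDT 2018 (EDT = UTC-4).
edt = timezone(timedelta(hours=-4))
event = datetime(2018, 3, 21, 11, 40, 0, tzinfo=edt)

expected = event.astimezone(timezone.utc)
emitted = datetime(2018, 3, 20, 15, 40, 0, tzinfo=timezone.utc)  # from rubydebug

print(expected.isoformat())   # 2018-03-21T15:40:00+00:00
print(expected - emitted)     # 1 day, 0:00:00
```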
Because of the unusual date format, the date.pattern file listed below was created to parse the timestamp.
I know you have helped others. Maybe you can decipher this one.
message:
<><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40
date.pattern file:
AUDIT_MONTHDAY (?:(?:[0 ][1-9])|(?:[12][0-9])|(?:3[01])|[1-9])
DATE_AUDIT %{DAY}%{SPACE}%{MONTH}%{SPACE}%{AUDIT_MONTHDAY}%{SPACE}%{TIME}%{SPACE}%{TZ}%{SPACE}%{YEAR}
DATE_TZ %{DATESTAMP:timestamp}%{SPACE}%{TZ}
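To double-check that DATE_AUDIT actually matches the sample timestamp, here is a standalone Python regex sketch with the grok patterns hand-expanded (the DAY, MONTH, TIME, TZ, and YEAR expansions are approximations of the stock grok definitions, not the exact shipped regexes):

```python
import re

# Hand-expanded approximations of the grok patterns referenced by DATE_AUDIT.
DAY = r"(?:Mon|Tue|Wed|Thu|Fri|Sat|Sun)"
MONTH = r"(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)"
AUDIT_MONTHDAY = r"(?:(?:[0 ][1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"
TIME = r"(?:\d{2}:\d{2}:\d{2})"
TZ = r"(?:[A-Z]{3})"
YEAR = r"(?:\d{4})"

DATE_AUDIT = rf"{DAY}\s+{MONTH}\s+{AUDIT_MONTHDAY}\s+{TIME}\s+{TZ}\s+{YEAR}"

line = "<><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40"
m = re.match(rf"^<><><><>\s+\[(?P<timestamp>{DATE_AUDIT})\]\s+(?P<narrative>.*)", line)
print(m.group("timestamp"))   # Tue Mar 21 11:40:00 EDT 2018
print(m.group("narrative"))   # AUDITLOGS Backup Success.5-11:40
```

So the custom patterns themselves extract the timestamp field correctly, which matches the rubydebug output below.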
Filter:
if "_grokparsefailure" in [tags] {
  grok {
    patterns_dir => ["/home/dplrgid8/logstash/dev/bin/patterns"]
    match => { "message" => "^<><><><>%{SPACE}\[%{DATE_AUDIT:timestamp}\]%{SPACE}%{GREEDYDATA:narrative}" }
    remove_tag => ["_grokparsefailure"]
    add_field => [ "tags", "grok25098" ]
  }
  date {
    match => [ "timestamp", "EEE MMM dd HH:mm:ss zzz yyyy" ]
  }
}
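One detail about the sample data itself may be relevant: the date filter pattern includes EEE (day of week), but the calendar weekday for March 21, 2018 is Wednesday, while the log line says Tue. A quick check (plain Python, just to illustrate the mismatch; whether the Joda parser behind the date filter resolves the conflict by snapping to the stated weekday is the open question):

```python
from datetime import datetime

# The log line claims "Tue Mar 21 ... 2018" -- but what weekday was 2018-03-21?
print(datetime(2018, 3, 21).strftime("%a"))  # Wed
# The Tuesday of that same week is March 20, the date that shows up in @timestamp.
print(datetime(2018, 3, 20).strftime("%a"))  # Tue
```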
Rubydebug Output:
{
    "@timestamp" => "2018-03-20T15:40:00.000Z",
    "message" => "<><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40 ",
    "@version" => "1",
    "path" => "/home/dplrgid8/logstash/data360/logs/Backup_Audit.log",
    "host" => "FQDN",
    "type" => "Backup_Audit",
    "application" => "Data_360_-_Find",
    "component" => "MARIA_DB",
    "environment" => "qa",
    "logstash_version" => "2.4.0",
    "filter_date" => "2018/03/21 10:38:36 CDT",
    "geoip" => {
        "location" => {
            "lat" => "LAT",
            "lon" => "-LON"
        }
    },
    "tags" => [
        [0] "grok25098"
    ],
    "timestamp" => "Tue Mar 21 11:40:00 EDT 2018",
    "narrative" => "AUDITLOGS Backup Success.5-11:40 "
}
Kibana Document:
March 20th 2018, 10:40:00.000 Data_360_-_Find
MARIA_DB
FQDN
- - AUDITLOGS Backup Success.5-11:40
Backup_Audit
Link to /dplr-qa-logstash-2018.03.20/Backup_Audit/AWJJOc3GNltr8GYV7Vm7
Table
JSON
@timestamp March 20th 2018, 10:40:00.000
t @version 1
t _id AWJJOc3GNltr8GYV7Vm7
t _index dplr-qa-logstash-2018.03.20
_score -
t type Backup_Audit
t application Data_360_-_Find
t component MARIA_DB
t environment qa
t filter_date 2018/03/21 10:38:36 CDT
geoip.location {
"lat": "LAT",
"lon": "-LON"
}
t host FQDN
t logstash_version 2.4.0
t message <><><><> [Tue Mar 21 11:40:00 EDT 2018] AUDITLOGS Backup Success.5-11:40
t narrative AUDITLOGS Backup Success.5-11:40
t path /home/dplrgid8/logstash/data360/logs/Backup_Audit.log
t tags grok25098
? timestamp Tue Mar 21 11:40:00 EDT 2018
t type Backup_Audit