Hi.
How can I configure Logstash so that @timestamp matches the time in the message?
My config is:
input {
  beats {
    port => 5044
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => {
        "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
      }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
A_B
January 29, 2020, 2:07pm
Hi @faridmmv ,
I'm pretty sure your logs show a _dateparsefailure tag: the date format of syslog_timestamp doesn't match the date formats you list. Try something like this for the date filter:
date {
  match => [ "syslog_timestamp", "dd-MM-yyyy HH:mm:ss.SSS" ]
}
If I got the time format wrong, check the date filter documentation.
Hi,
Thanks for the reply. Actually there was no error, but I've changed the line to
match => [ "syslog_timestamp", "dd-MM-yyyy HH:mm:ss.SSS" ]
This did not fix the mismatch between @timestamp and the message time, though. Any idea how I can fix this?
Badger
January 29, 2020, 5:50pm
Can you show us a complete event, copied and pasted from the JSON tab in Kibana's Discover?
{
  "_index": "filebeat-2020.01.29",
  "_type": "_doc",
  "_id": "YcJi7b7BHduoqUKWDS2v",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-01-29T17:38:16.968Z",
    "message": "29-01-2020 21:38:12.170 [http-nio-8000-exec-499] DEBUG a.e.k.p.aop.logging.LoggingAspect.logAround -> Exit: privatedata.service.person.FinService.getPersonByIdentity() with result = ***SOME REMOVED DATA***",
    "@version": "1",
    "host": {
      "containerized": false,
      "os": {
        "family": "redhat",
        "version": "7 (Core)",
        "kernel": "3.10.0-957.27.2.el7.x86_64",
        "platform": "centos",
        "codename": "Core",
        "name": "CentOS Linux"
      },
      "id": "3c0265f0dd654cc5hgfdd8757f3b8a",
      "hostname": "server-3",
      "architecture": "x86_64",
      "name": "server-3"
    },
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "log": {
      "file": {
        "path": "/var/log/java/customService.log"
      },
      "offset": 9588320
    },
    "agent": {
      "hostname": "server-3",
      "id": "4a79fbc7-968b-4cd3-a2a8-7525679fgdfk09",
      "version": "7.5.1",
      "ephemeral_id": "056f95ede-ae45-40a9-beab-a795638bc084",
      "type": "filebeat"
    },
    "ecs": {
      "version": "1.1.0"
    },
    "input": {
      "type": "log"
    }
  },
  "fields": {
    "@timestamp": [
      "2020-01-29T17:38:16.968Z"
    ]
  },
  "sort": [
    15803194047695
  ]
}
Badger
January 29, 2020, 8:45pm
That does not contain a syslog_timestamp field. The date filter is a no-op if the source field does not exist.
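A quick way to check which fields actually exist on an event is to temporarily print it to stdout alongside the elasticsearch output (a debugging sketch, not part of your original config):

output {
  stdout { codec => rubydebug }
}

Watch the Logstash console output and confirm whether syslog_timestamp (or any field your filters depend on) is present.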
Badger
January 30, 2020, 1:12am
You made the grok filter conditional on [type] being "syslog", but the [type] field does not exist either, so the grok filter never gets applied.
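If you want a conditional at all, it needs to test a field that actually exists on these events. For example (a sketch, using the log path from the event you posted):

filter {
  if [log][file][path] == "/var/log/java/customService.log" {
    # filters placed here only run for events from this log file
  }
}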
I sort of understand; this config was taken from the internet as part of a larger configuration.
Could you please suggest a configuration that makes @timestamp and the message time match?
Thanks
I tried this config, but the issue still persists:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
Badger
January 30, 2020, 5:52pm
That date/time does not match TIMESTAMP_ISO8601, which has to be year followed by month followed by day.
grok { match => ["message", "%{DATESTAMP:timestamp}"] }
will work. Your date filter also needs to change since, as I said, it's not in ISO8601 format.
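For the dd-MM-yyyy format in your sample message, something like this should work (a sketch, assuming all your log lines start with that same format):

date {
  match => ["timestamp", "dd-MM-yyyy HH:mm:ss.SSS"]
}

Note that the date filter interprets the parsed time in the Logstash host's local timezone unless you set the timezone option, so @timestamp will be stored in UTC with the corresponding offset applied.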
The config below worked:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => ["message", "%{DATESTAMP:timestamp}"]
  }
  date {
    match => ["timestamp", "dd-MM-yyyy HH:mm:ss.SSS"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+dd-MM-yyyy}"
  }
}
Thanks for help @Badger
system (Closed)
February 28, 2020, 5:37am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.