ThePreMan
(The Pre Man)
September 10, 2020, 2:05pm
1
Hi, I am new to the Elastic Stack. I am currently having trouble setting the timestamp within the Logstash config.
The date format to be read:
2019/06/19 10:05:37.000000
19046 2019/06/19 10:07:11.000000 3893143 075 OTAF DANL HADT log info V 1 [11 40 41 00 41 00 00 00 c2 f0 05 00 c4 01 00 00 00 00 00 00 00 00 d1 00 5b 50 45 52 53 5d 5b 6f 74 61 2d 66 6d 23 6a 6f 62 73 5d 20 44 61 74 61 62 61 73 65 20 61 6c 72 65 61 64 79 20 49 4e 49 54 49 41 4c 49 5a 45 44 00]
My config:
filter {
  date {
    match => [ "logdate", "yyyy/MM/dd HH:mm:ss", "ISO8601" ]
  }
}
Thanks
Badger
September 10, 2020, 2:50pm
2
You need to match the entire string, so it should be:
"yyyy/MM/dd HH:mm:ss.SSSSSS"
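To illustrate the point (my own sketch, not from the thread): Python's `strptime` shows the same effect. A pattern that stops at the seconds does not consume the fractional part and fails, while one covering the full string parses cleanly. Python's `%f` (up to six digits) plays roughly the role of the Joda-style `SSSSSS` here.

```python
from datetime import datetime

sample = "2019/06/19 10:05:37.000000"

# A pattern without the fractional seconds leaves ".000000" unconsumed
# and raises ValueError ("unconverted data remains").
try:
    datetime.strptime(sample, "%Y/%m/%d %H:%M:%S")
except ValueError as e:
    print("short pattern fails:", e)

# Covering the entire string, including the fractional part, works.
parsed = datetime.strptime(sample, "%Y/%m/%d %H:%M:%S.%f")
print(parsed.isoformat())  # 2019-06-19T10:05:37
```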
ThePreMan
(The Pre Man)
September 10, 2020, 2:56pm
3
Would this be correct?
filter {
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss.SSSSSSS", "ISO8601" ]
  }
}
As far as I understand, this should replace the timestamp in Elasticsearch and Kibana with 2019/06/...?
Badger
September 10, 2020, 3:25pm
6
ThePreMan:
SSSSSSS
That has seven S's; it should be six.
ThePreMan
(The Pre Man)
September 10, 2020, 3:51pm
7
ThePreMan:
SSSS
Still no changes. Just to be sure:
input {
  beats {
    port => 5044
    client_inactivity_timeout => 84600
  }
}

filter {
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss.SSSSSS", "ISO8601" ]
  }
}

output {
  # Receiving application - here Elasticsearch
  elasticsearch {
    # hostname/ip and port
    hosts => ["elk:9200"]
    # index for the received messages
    index => "fb"
  }
}
Badger
September 10, 2020, 3:57pm
8
If you use
output { stdout { codec => rubydebug } }
what do the timestamp and @timestamp fields look like?
ThePreMan
(The Pre Man)
September 10, 2020, 4:08pm
10
{
  "@timestamp" => 2020-09-10T16:05:43.927Z,
  "agent" => {
    "ephemeral_id" => "3da64893-dfa1-4043-811c-01dfc36a8aac",
    "hostname" => "67b4e0dbbe13",
    "id" => "5a9b3ef2-f5e4-4f8e-a9a6-c5d4aef288d2",
    "version" => "7.2.0",
    "type" => "filebeat"
  },
  "input" => {
    "type" => "log"
  },
  "@version" => "1",
  "host" => {
    "name" => "67b4e0dbbe13"
  },
  "message" => "2863 2019/06/19 09:41:39.000000 13042924 021 OTAF KSBT mock log info V 1 [2019-06-19 09:31:21.599 000000000000000C D Platform: >> RealtimeClock.getTime]",
  "log" => {
    "offset" => 574435,
    "file" => {
      "path" => "/usr/share/filebeat/data/dlt-c.log"
    }
  },
  "tags" => [
    [0] "beats_input_codec_plain_applied"
  ],
  "ecs" => {
    "version" => "1.0.0"
  }
}
Badger
September 10, 2020, 4:21pm
11
There is no [timestamp] field. You need to extract it from the message. You could do that with
dissect { mapping => { "message" => "%{} %{timestamp} %{+timestamp} %{}" } }
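A rough sketch of what that dissect pattern does (my own plain-Python re-implementation for illustration, not the actual dissect filter): the leading `%{}` skips the first space-delimited token, `%{timestamp} %{+timestamp}` captures the next two tokens joined by a space, and the trailing `%{}` swallows the rest of the line.

```python
# Hypothetical re-implementation of the dissect pattern
# "%{} %{timestamp} %{+timestamp} %{}" in plain Python.
def extract_timestamp(message: str) -> str:
    parts = message.split(" ")
    # %{} skips parts[0]; %{timestamp} %{+timestamp} joins parts[1:3]
    # with a space; the final %{} discards everything after.
    return " ".join(parts[1:3])

msg = ("2863 2019/06/19 09:41:39.000000 13042924 021 OTAF KSBT mock "
       "log info V 1 [...]")
print(extract_timestamp(msg))  # 2019/06/19 09:41:39.000000
```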
1 Like
ThePreMan
(The Pre Man)
September 10, 2020, 4:24pm
12
Is there any additional config needed? Logstash can't read that:
input { ... }
dissect {
  mapping => { "message" => "%{} %{timestamp} %{+timestamp} %{}" }
}
filter { ... }
output { ... }
Badger
September 10, 2020, 4:31pm
13
It needs to be inside the filter {} section.
ThePreMan
(The Pre Man)
September 10, 2020, 4:50pm
14
Thank you so much, it is working now.
volba
(Bart V)
September 16, 2020, 11:07am
15
@Badger - would this work with Filebeat as well? I need to set up a simple SQL log parser and have the same issue: getting the event date/time out and indexing using that timestamp.
I currently have no filter set up in my Filebeat config; I am just sending the full line of text.
@ThePreMan - could you post the final config from the filter section, please?
ThePreMan
(The Pre Man)
September 21, 2020, 6:23pm
16
input {
  beats {
    port => 5044
    client_inactivity_timeout => 84600
  }
}

filter {
  dissect {
    mapping => {
      "message" => "%{} %{Time} %{+Time} %{}"
    }
  }
  date {
    match => [ "Time", "yyyy/MM/dd HH:mm:ss.SSSSSS" ]
    remove_field => ["Time"]
  }
}
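For anyone adapting this, a hedged Python sketch of what the dissect + date pair above does to one event: grab the second and third space-delimited tokens, then parse them with the full fractional-seconds pattern (`%f` standing in for `SSSSSS`; UTC is an assumption, since the log line carries no time zone).

```python
from datetime import datetime, timezone

# Hypothetical end-to-end sketch of the dissect + date pipeline above.
def to_event_time(message: str) -> datetime:
    parts = message.split(" ")
    raw = " ".join(parts[1:3])                            # dissect step
    naive = datetime.strptime(raw, "%Y/%m/%d %H:%M:%S.%f")  # date step
    return naive.replace(tzinfo=timezone.utc)             # assumed UTC

msg = "19046 2019/06/19 10:07:11.000000 3893143 075 OTAF DANL ..."
print(to_event_time(msg).isoformat())  # 2019-06-19T10:07:11+00:00
```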
1 Like
system
(system)
Closed
October 19, 2020, 6:23pm
17
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.