Hi,
I am using ELK GA 5.0.0 and consuming messages from a Kafka topic with Logstash. The messages are JSON encoded and contain a date field like \t[05/Feb/2018:10:39:47 +0000]\t. In my grok filter I parse it like this:
\t[(?<timestamp>%{MONTHDAY}/%{MONTH}/20%{YEAR}:%{HOUR}:?%{MINUTE}:(?::?%{SECOND}) Z)]\t
This produced a _grokparsefailure tag. To debug the issue, I tried:
\t[%{GREEDYDATA:time}]\t
That failed the same way, so I tried:
\t\[%{GREEDYDATA:time}\]\t
and
\t\\[%{GREEDYDATA:time}\\]\t
Still no luck. Finally, I tried:
\t%{GREEDYDATA:time}\t
This worked, so the problem is with the [ and ] characters. How can I escape them properly so that my original timestamp pattern works?
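For context on why the brackets matter: grok patterns compile down to regular expressions, where an unescaped [ opens a character class instead of matching a literal bracket. A minimal Ruby sketch (Ruby regexes, like grok's, are Oniguruma-based; the variable names are illustrative only):

```ruby
line = "\t[05/Feb/2018:10:39:47 +0000]\t"

# Unescaped: "[.*]" is a character class containing "." and "*",
# so this pattern never matches the literal brackets in the line.
unescaped = line.match(/\t[.*]\t/)
puts unescaped.inspect  # => nil

# Escaped: \[ and \] match the literal brackets, and the named
# capture grabs the timestamp between them.
escaped = line.match(/\t\[(?<time>.*)\]\t/)
puts escaped[:time]  # => 05/Feb/2018:10:39:47 +0000
```

The same escaping rule applies inside a grok pattern, which is why \[ and \] are needed around the timestamp.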
Thank you.
pjanzen
(Paul Janzen)
March 13, 2018, 12:47pm
2
This seems to work for me.
\\t\[%{GREEDYDATA:timestamp}\]\\t
On http://grokdebug.herokuapp.com/ it gives me the result below.
{
  "timestamp": [
    [
      "05/Feb/2018:10:39:47 +0000"
    ]
  ]
}
elasticheart:
\t[%{GREEDYDATA:time}]\t
Hi, it's a valid pattern on the site, but I still get the same failure in Logstash.
pjanzen
(Paul Janzen)
March 13, 2018, 12:56pm
4
Can you share your complete Logstash filter? Please use the </> formatting when you paste text.
grok {
  match => { "message" => "%{NOTSPACE:f1}\t%{NOTSPACE:f2}\t%{NOTSPACE:f3}\t[(?<timestamp>%{MONTHDAY}/%{MONTH}/20%{YEAR}:%{HOUR}:?%{MINUTE}:(?::?%{SECOND}) Z)]\t\"%{GREEDYDATA:f4}\"\t%{NOTSPACE:f5}\t%{NOTSPACE:f6}\t%{NOTSPACE:f7}\t\"%{GREEDYDATA:f8}\"\t\"%{GREEDYDATA:f9}\"\t\"%{GREEDYDATA:f10}\"\t\"%{GREEDYDATA:f11}\"\t%{NOTSPACE:f12}\t%{NOTSPACE:f13}" }
}
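For what it's worth, escaping the square brackets in that filter might look like the sketch below (untested; it also swaps the hand-built timestamp sub-pattern for the stock HTTPDATE pattern, since the original's literal " Z" would not match the "+0000" offset in the sample line, and the other fields are kept as-is):

```
grok {
  match => { "message" => "%{NOTSPACE:f1}\t%{NOTSPACE:f2}\t%{NOTSPACE:f3}\t\[%{HTTPDATE:timestamp}\]\t\"%{GREEDYDATA:f4}\"\t%{NOTSPACE:f5}\t%{NOTSPACE:f6}\t%{NOTSPACE:f7}\t\"%{GREEDYDATA:f8}\"\t\"%{GREEDYDATA:f9}\"\t\"%{GREEDYDATA:f10}\"\t\"%{GREEDYDATA:f11}\"\t%{NOTSPACE:f12}\t%{NOTSPACE:f13}" }
}
```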
pjanzen
(Paul Janzen)
March 13, 2018, 1:04pm
6
Thanks, and an example logline?
pjanzen
(Paul Janzen)
March 13, 2018, 1:08pm
7
I did a quick test and came up with this.
Input data (assuming the \t is a tab in the input file)
pjanzen@logstash1:~$ cat /home/pjanzen/input.txt
[05/Feb/2018:10:39:47 +0000]
pjanzen@logstash1:~$
input {
  file {
    path => "/home/pjanzen/input.txt"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "\t\[%{GREEDYDATA:timestamp}\]\t" }
  }
}
output {
  stdout { codec => rubydebug }
}
The result is this.
pjanzen@logstash1:~$ sudo /usr/share/logstash/bin/logstash --path.settings=/etc/logstash -f /home/pjanzen/test.conf
Sending Logstash's logs to /opt/logstash/logs which is now configured via log4j2.properties
{
          "path" => "/home/pjanzen/input.txt",
    "@timestamp" => 2018-03-13T13:06:30.315Z,
      "@version" => "1",
          "host" => "logstash1",
       "message" => "\t[05/Feb/2018:10:39:47 +0000]\t",
     "timestamp" => "05/Feb/2018:10:39:47 +0000"
}
I hope this helps.
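As a possible next step once grok captures the timestamp field (not covered in this thread, and the format string is an assumption based on the sample line's dd/MMM/yyyy:HH:mm:ss Z layout), a date filter can turn it into the event's @timestamp:

```
filter {
  date {
    # Parse the grok-captured field; on success Logstash sets @timestamp.
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```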
system
(system)
Closed
April 10, 2018, 1:10pm
8
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.