blenolopes
(Bleno Lopes)
January 14, 2020, 7:49pm
1
Hello logstashers,
I have an "atypical" problem with a particular log format when using the date filter. The tests are below.
bin/logstash --version
logstash 7.5.0
Success
~$ echo "14/01/2020 16:03:00,143" | bin/logstash -e 'input { stdin {} } filter { date { match => [ "message", "dd/MM/yyyy HH:mm:ss,SSS" ] } }'
{
"message" => "14/01/2020 16:03:00,143",
"@version " => "1",
"@timestamp " => 2020-01-14T19:03:00.143Z,
"host" => "myhost"
}
Failure
~$ echo "[14/01/2020 16:03:00,143 XXX] [XXX] MSG" | bin/logstash -e 'input { stdin {} } filter { date { match => [ "message", "dd/MM/yyyy HH:mm:ss,SSS" ] } }'
{
"@timestamp " => 2020-01-14T19:42:51.585Z,
"@version " => "1",
"message" => "[14/01/2020 16:03:00,143 XXX] [XXX] MSG",
"tags" => [
[0] "_dateparsefailure"
],
"host" => "myhost"
}
Where am I going wrong?
Thanks in advance for your help and attention.
Badger
January 14, 2020, 8:10pm
2
blenolopes:
Where am I going wrong?
The pattern in the date filter has to match the entire field. Your [message] field in the second example contains additional text that your pattern does not match. I would use dissect to extract the date from [message]:
dissect { mapping => { "message" => "[%{[@metadata][timestamp]} %{+%{[@metadata][timestamp]} %{}" } }
date { match => [ "[@metadata][timestamp]", "dd/MM/yyyy HH:mm:ss,SSS" ] }
1 Like
blenolopes
(Bleno Lopes)
January 15, 2020, 4:31pm
3
Hello Badger,
Thanks for the answer, but I still have the problem. When testing, Logstash throws an exception because it cannot parse the field reference passed in the dissect mapping.
[ERROR] 2020-01-15 13:20:38.686 [[main]>worker0] Dissector - Dissect threw an exception {exception=org.logstash.FieldReference$IllegalSyntaxException: Invalid FieldReference: `%{[@metadata][timestamp]`, backtrace=org.logstash.FieldReference$StrictTokenizer.tokenize(FieldReference.java:283)
org.logstash.FieldReference.parse(FieldReference.java:184)
org.logstash.FieldReference.parseToCache(FieldReference.java:175)
org.logstash.FieldReference.from(FieldReference.java:107)
org.logstash.Event.includes(Event.java:194)
org.logstash.dissect.fields.AppendField.append(AppendField.java:42)
org.logstash.dissect.Dissector.dissect(Dissector.java:142)
org.logstash.dissect.JavaDissectorLibrary$RubyDissect.invokeDissection(JavaDissectorLibrary.java:180)
org.logstash.dissect.JavaDissectorLibrary$RubyDissect.dissect(JavaDissectorLibrary.java:129)
org.logstash.dissect.JavaDissectorLibrary$RubyDissect.lambda$dissectMulti$0(JavaDissectorLibrary.java:148)
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:658)}
{
"@timestamp" => 2020-01-15T16:20:37.160Z,
"@version" => "1",
"message" => "[14/01/2020 16:03:00,143 XXX] [XXX] MSG",
"host" => "myhost",
"tags" => [
[0] "_dissectfailure",
[1] "_dateparsefailure"
]
}
I tried removing [@metadata], but the problem persists. Another question: why use [@metadata] before the name/alias of the [timestamp] field?
My test line:
echo "[14/01/2020 16:03:00,143 XXX] [XXX] MSG" | bin/logstash -e 'input { stdin {} } filter { dissect { mapping => { "message" => "[%{[@metadata][timestamp]} %{+%{[@metadata][timestamp]} %{}" } } date { match => [ "[@metadata][timestamp]", "dd/MM/yyyy HH:mm:ss,SSS" ] } }'
Thanks for your attention.
blenolopes
(Bleno Lopes)
January 15, 2020, 5:27pm
4
Hi Badger,
I managed to solve it; the solution follows. Thank you for your help.
dissect {
mapping => {
"message" => "[%{timestamp} %{+timestamp} %{}"
}
}
date {
match => [ "timestamp", "dd/MM/yyyy HH:mm:ss,SSS" ]
target => "@timestamp"
}
Badger
January 15, 2020, 5:38pm
5
Yeah, I had an extra %{ in there. It should have been:
dissect { mapping => { "message" => "[%{[@metadata][timestamp]} %{+[@metadata][timestamp]} %{}" } }
I use [@metadata] since it can be referenced in the filters but is not included in the output. It probably does not add any value to have a [timestamp] field in your final documents.
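Putting the corrected dissect together with the earlier date filter, the [@metadata] variant would be (a sketch; the parsed date then ends up only in @timestamp, with no extra timestamp field in the event):

```
dissect { mapping => { "message" => "[%{[@metadata][timestamp]} %{+[@metadata][timestamp]} %{}" } }
date { match => [ "[@metadata][timestamp]", "dd/MM/yyyy HH:mm:ss,SSS" ] }
```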
1 Like
system
(system)
Closed
February 12, 2020, 5:38pm
6
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.