Logstash failing with dissect when adding one field with a date

I'm using Logstash 7.2.0, which is being fed logs through Filebeat. I have done a decent amount of work parsing the logs coming in, but keep running into this recurring issue.
Sending a log like this:
10.0.0.248 - - [22/Oct/2019:16:14:22 +0100] 005_D34DB33F "POST /uri/uri/Wibble HTTP/1.1" 200 614994 1240 "-" "Another string here 10h"

Using dissect, this works as expected:

filter {
  dissect {
    mapping => {
      "message" => '%{IP} %{user} - [%{}] %{OtherThings}'
    }
  }
}

However, as soon as I change it to this, I stop seeing the logs in Elasticsearch entirely. They aren't even there with a _dissectfailure tag or anything:

filter {
  dissect {
    mapping => {
      "message" => '%{IP} %{user} - [%{logDateTime}] %{OtherThings}'
    }
  }
}

Where am I going wrong here?

That second one works just fine for me, which suggests that you are changing something else as well. Is elasticsearch logging any errors?

@Badger Elasticsearch is logging so many errors from other logs that are failing to parse (unrelated) that it's impossible to locate the entries for this particular rule. I'm strongly inclined to think this is failing when being added to Elasticsearch too, but can't quite put my finger on why.

As an update: if I change the name of the %{logDateTime} field to anything that doesn't already exist in another rule, the second rule parses just fine!

It could be getting 400 errors for a mapping exception. That is, the type of logDateTime has already been set by another rule. Maybe one is a LogStash::Timestamp and the other is a string? Can you grep for logDateTime in the elasticsearch log?

So I had already been looking at the mappings, and it looks like that is indeed the problem: logDateTime has type date, and the value I am trying to add is not a proper date as far as Elasticsearch is concerned.
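For anyone hitting the same mapping conflict: one way out (a sketch, assuming the field holds an httpd-style timestamp like the sample line above) is to run the dissected field through the date filter, so the value reaching Elasticsearch is an actual timestamp rather than a string:

filter {
  dissect {
    mapping => {
      "message" => '%{IP} %{user} - [%{logDateTime}] %{OtherThings}'
    }
  }
  date {
    # Parses e.g. "22/Oct/2019:16:14:22 +0100"; on success logDateTime
    # becomes a LogStash::Timestamp, which indexes cleanly into a
    # field already mapped as type date.
    match  => [ "logDateTime", "dd/MMM/yyyy:HH:mm:ss Z" ]
    target => "logDateTime"
  }
}

Alternatively, renaming the field (as you found) avoids the collision, but parsing it keeps both rules writing a consistent type into the same field.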

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.