Log timestamp is not getting through the date filter; getting date_time_parse_exception

Hi Team,

I have the following Logstash configuration, where I have written date patterns to parse my timestamp, but it is not working. I'm getting _dateparsefailure.
Here is the configuration.

input { ............. }
filter {
    ..............
    .........
    if [time] {
        date {
            match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
            target => "@timestamp"
            locale => "en"
            timezone => "+04:00"
        }
        mutate {
            add_field => { "log" => "%{time} %{message}" }
            remove_field => ["kafkatime","message1","message"]
        }
    }
    else {
        mutate {
            add_field => { "log" => "%{message}" }
            remove_field => ["kafkatime","message1","message","time"]
        }
    }
    ...................
    ..............
}
output { ................. }

This is how the time appears in my logs:

"log"=>"2022-07-15 09:20:20,157 | tid:0ofq198rsnX3ziJ4Q92xsNZ-b2M|

Below is my failure message:

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [time] of type [date] in document with id 'obBnAIIBgI2nnIQahxqE'. Preview of field's value: '2022-07-15 09:20:20,157'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-07-15 09:20:20,157] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

Kindly help me to troubleshoot this.

Thanks,
Tahseen

I don't see any issues in the date format except the time zone. Change Europe/Berlin to yours from the list.

date {
    match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
    target => "@timestamp"
    locale => "en"
    timezone => "Europe/Berlin"
}

Hi @Rios ,

Yes, the pattern is right, because many events are getting parsed and ingested into Elasticsearch, but a few events are causing problems, with the error "could not index data into elasticsearch because of date failure".
The time zone you are suggesting, timezone => "Europe/Berlin", is +01:00, which is different from mine.

So I can't use the one you suggested; it would change the timezone. But I'll try using my Asia timezone instead of writing +04:00.

Thanks,
Tahseen

Hi @Rios ,

I tried using the timezone, but it didn't work.
Please help.

Thanks,
Tahseen

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [time] of type [date] in document with id 'obBnAIIBgI2nnIQahxqE'. Preview of field's value: '2022-07-15 09:20:20,157'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-07-15 09:20:20,157] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

The above error indicates that you are trying to index the time field. If you don't need this field, which I believe is redundant, you should remove it, since you have already used the date filter to store the timestamp in the @timestamp field.
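For example, something along these lines after your date filter should drop it (a minimal sketch; place it wherever fits in your pipeline):

mutate {
    # drop the raw "time" string once the date filter has set @timestamp
    remove_field => ["time"]
}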

Hi @hendry.lim ,

If you look at my parser, I am removing the time field when there is no time in the logs, and when there is, I am adding the time field to the message to create a field named "log".
I don't understand why my pattern is not matching the messages, or maybe the logs themselves are not going through my date pattern and that is throwing such errors.
I am not able to understand what is blocking here.

For the timezone, I gave you an example; use timezone => "Asia/Dubai" for +4h.
I have parsed the value "2022-07-10 09:20:20,157" with your date code without any problems.

Can you check which fields cause the problem? They should be visible in Elasticsearch with the tag _dateparsefailure.
Is it possible the "time" field is sometimes missing, or has other nonstandard values?
What type is the "time" field in Kibana: string or date?
If possible, I would temporarily use an additional index and would not delete the message field until this is solved.
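For example, a search roughly like this should surface the tagged events (replace the index name with yours):

GET your-index/_search
{
  "query": {
    "match": {
      "tags": "_dateparsefailure"
    }
  }
}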

Look again: you removed kafkatime, not time.


Hi @Rios ,

The time field in Kibana is of date type.
I have also tried my parser locally and it works fine, but with live logs it throws the error. I have also tried timezone => "Asia/Dubai", but that didn't work either.

Something is blocking it, but I am not able to identify what.

Hi @hendry.lim

You can see it has both conditions: when time is in the message, add the field; when time is not in the message, remove the time field.
But some of the events are not going through this pattern.

So when you do have time, you are not removing it, and that is where the problem with the time field lies. In fact, removing time when it does not exist is dead code.
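If you want to keep building the log field but not index the raw string, the removal has to happen inside the branch where time exists, roughly like this (a sketch based on your config above):

if [time] {
    date {
        match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
        target => "@timestamp"
        locale => "en"
        timezone => "+04:00"
    }
    mutate {
        add_field => { "log" => "%{time} %{message}" }
        # remove "time" here as well; add_field is applied before remove_field,
        # so "log" still gets the original time string
        remove_field => ["kafkatime","message1","message","time"]
    }
}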

Hi @hendry.lim ,

I tried removing the time field, but data stopped coming into ES.

Any errors in Logstash or Elasticsearch?

Can you share the full log error? You shared just part of it, which can be misleading.

The error you shared is normally thrown as a WARN when you have mapping issues while trying to index into Elasticsearch; it has a return code of 400. It is not a Logstash issue, but a problem with your mapping in Elasticsearch.

The mapping for the time field probably does not accept the value 2022-07-15 09:20:20,157, even if it is recognized as a date.

Share the mapping of this field in your index; use the following request for it.

GET your-index/_mapping/field/time

Also, try adding a second date filter after the one you have, using the time field as the target as well, and see if this solves the issue:

date {
    match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
    target => "@timestamp"
    locale => "en"
    timezone => "+04:00"
}
date {
    match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
    target => "time"
    locale => "en"
    timezone => "+04:00"
}

Hi @leandrojmp ,

Here is the mapping of my time field.

{
  "my-index-2022.07.10-000007" : {
    "mappings" : {
      "time" : {
        "full_name" : "time",
        "mapping" : {
          "time" : {
            "type" : "date"
          }
        }
      }
    }
  },
  "my-index-2022.07.07-000006" : {
    "mappings" : {
      "time" : {
        "full_name" : "time",
        "mapping" : {
          "time" : {
            "type" : "date"
          }
        }
      }
    }
  },
  "my-index-2022.07.12-000008" : {
    "mappings" : {
      "time" : {
        "full_name" : "time",
        "mapping" : {
          "time" : {
            "type" : "date"
          }
        }
      }
    }
  },
  "my-index-2022.07.14-000009" : {
    "mappings" : {
      "time" : {
        "full_name" : "time",
        "mapping" : {
          "time" : {
            "type" : "date"
          }
        }
      }
    }
  },
  "my-index-2022.07.15-000010" : {
    "mappings" : {
      "time" : {
        "full_name" : "time",
        "mapping" : {
          "time" : {
            "type" : "date"
          }
        }
      }
    }
  }
}

In the suggestion you gave me, the target is target => "time", but I want the target to be @timestamp, not time.

Below is the Logstash output.

Jul 16 14:27:19 logstash[93697]: [2022-07-16T14:27:19,957][WARN ][logstash.outputs.elasticsearch][myindex][output] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"my-index-new1", :routing=>nil}, {"host.os.family"=>"*****", "source.ip"=>"*******", "host.id"=>"8ef88257d7814553be3cd5b4afc07d0a", "host.architecture"=>"x86_64", "agent.version"=>"Fluent Bit v1.8.11", "entity"=>"xyz", "host.os.platform"=>"rhel", "@timestamp"=>2022-07-15T15:58:11.000Z, "tailed_path"=>"/xyz/qwe-9.1.3/xyz/log/audit.log", "host.name"=>"*******", "component"=>"my-index-auditlog", "host.os.name"=>"Red Hat Enterprise Linux Server", "agent.hostname"=>"*******", "host.ip"=>"*********", "@version"=>"1", "log"=>"2022-07-15 19:58:11 ,803| tid:sQhsGRmRXUAFLqHVgCKcZBF2V5k| OAuth| QuhR| 10.191.9.49 | | finnone_client| OIDC| AS| success| | | 7 ", "host.os.kernel"=>"******1.el7.x86_64", "host.os.version"=>"7.9 (Maipo)", "type"=>"my-index", "agent.type"=>"fluent-bit", "time"=>"2022-07-15 19:58:11", "host.hostname"=>"*********"}], :response=>{"index"=>{"_index"=>"my-index-new1-2022.07.15-000010", "_type"=>"_doc", "_id"=>"mamMBoIBgI2nnIQaM9uU", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [time] of type [date] in document with id 'mamMBoIBgI2nnIQaM9uU'. Preview of field's value: '2022-07-15 19:58:11'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-07-15 19:58:11] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

Thanks,
Tahseen

The issue is what I said in the previous post: your time field is not being recognized as a date field by Elasticsearch, because the format of the date string does not match the default mapping of the date field, which is strict_date_optional_time||epoch_millis.

This expects the date string to be in ISO 8601 format or in epoch time; the date string of your time field matches neither of these, so Elasticsearch will reject the document.

The easiest solution is also what I said in the previous post: apply another date filter using the time field as the target, and put it after the date filter whose target is the @timestamp field. You can have both filters.

You can replicate this issue pretty easily with the following steps:

PUT date-issue

PUT /date-issue/_mapping
{
  "properties": {
    "time": {
      "type": "date"
    }
  }
}

This will create an index with the time field mapped as type date.

Now if you try to add a document where the time field has the value 2022-07-15 19:58:11, you will get the same error.

POST date-issue/_doc/1
{
  "@timestamp": "2099-11-15T13:00:00",
  "time": "2022-07-15 19:58:11"
}

This is the response:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "failed to parse field [time] of type [date] in document with id '1'. Preview of field's value: '2022-07-15 19:58:11'"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "failed to parse field [time] of type [date] in document with id '1'. Preview of field's value: '2022-07-15 19:58:11'",
    "caused_by" : {
      "type" : "illegal_argument_exception",
      "reason" : "failed to parse date field [2022-07-15 19:58:11] with format [strict_date_optional_time||epoch_millis]",
      "caused_by" : {
        "type" : "date_time_parse_exception",
        "reason" : "Failed to parse with all enclosed parsers"
      }
    }
  },
  "status" : 400
}

Now, if you correct the value of the time field to one in an ISO 8601 format, it will work:

POST date-issue/_doc/1
{
  "@timestamp": "2099-11-15T13:00:00",
  "time": "2022-07-15T19:58:11"
}

To solve this you have two options. The first one was already mentioned: just add another date filter using time as the target.

The other solution would be changing your mapping and adding a format that would also match your field.

For example:

PUT /date-issue/_mapping
{
  "properties": {
    "time": {
      "type": "date",
      "format": "strict_date_optional_time||epoch_millis||yyyy-MM-dd HH:mm:ss"
    }
  }
}
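If the live index does not accept the mapping change, the format can go into an index template instead, so it applies when the index rolls over. A hypothetical template (the name and index pattern here are placeholders) might look like:

PUT _index_template/my-template
{
  "index_patterns": ["my-index-*"],
  "template": {
    "mappings": {
      "properties": {
        "time": {
          "type": "date",
          "format": "strict_date_optional_time||epoch_millis||yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}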

Hi @leandrojmp ,

Let me try your suggestions.

Hi @leandrojmp ,

I tried the above suggestion and added the two date filters, one with target => "@timestamp" and the other with target => "time", and it worked out. Changing the mapping would have required me to create a new template and a new index as well.
Thank you very much for taking the time to look into my case and give me the solution.
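For reference, the if [time] branch of my filter now looks roughly like this (a sketch with my original field names; the first date filter sets @timestamp, the second rewrites time into a format the date mapping accepts):

if [time] {
    date {
        match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
        target => "@timestamp"
        locale => "en"
        timezone => "+04:00"
    }
    mutate {
        # build "log" from the original time string before it is rewritten below
        add_field => { "log" => "%{time} %{message}" }
        remove_field => ["kafkatime","message1","message"]
    }
    date {
        match => ["time", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ,SSS"]
        target => "time"
        locale => "en"
        timezone => "+04:00"
    }
}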

Thank you,
Tahseen
