_dateparsefailure for timestamp in this format: 2023-01-11T05:07:30.648881Z

I am trying to parse a field called create_ts with a value of 2023-01-11T05:07:30.648881Z and then add fields for year, month and day.

I have scoured everything to figure out the correct timestamp pattern and cannot seem to make it work.

I am using this filter:

filter {
   date {
      match => ["create_ts", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"]
      add_field => {
         "year"  => "%{+YYYY}"
         "month" => "%{+MM}"
         "day"   => "%{+dd}"
      }
   }
}

but I get a _dateparsefailure. I am sure I am doing something stupid, but I am not sure what.

Please share the full error log you are getting and a sample document.

This is a sample document, as shown by my stdout plugin:

       "@timestamp" => 2023-01-11T16:59:27.379431900Z,
         "@version" => "1",
    "create_yyyymm" => 202301,
        "create_ts" => 2023-01-09T05:13:21.411713Z,
             "tags" => [
        [0] "_dateparsefailure"
    "create_user_i" => "EMTREDIP"

I can convert with your data & code on v8.5.3 without any problem.

   date {
      match => ["create_ts", "ISO8601"]
      add_field => {
         "year"  => "%{+YYYY}"
         "month" => "%{+MM}"
         "day"   => "%{+dd}"
      }
   }

create_ts is not a string, it is a LogStash::Timestamp (there are no double quotes around the value). A date filter cannot parse that. Use mutate+convert to make it a string before the date filter.
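A minimal sketch of that suggestion (field names as used in this thread; the patterns are illustrative and may need adjusting):

```
filter {
  # create_ts arrives as a LogStash::Timestamp object, not a string,
  # so turn it back into a string the date filter can parse
  mutate { convert => { "create_ts" => "string" } }

  # parse the string, then copy year/month/day out of the parsed date
  date {
    match => ["create_ts", "ISO8601"]
    add_field => {
      "year"  => "%{+YYYY}"
      "month" => "%{+MM}"
      "day"   => "%{+dd}"
    }
  }
}
```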

Thanks @Rios! It works with ISO8601.

@Badger, the JSON is created by the jdbc input plugin, so I am not actually creating it. I assumed the jdbc input plugin created the create_ts value as a string. But I do see that it looks like Logstash changed it to a non-string timestamp, which confuses me a little. Changing the timestamp pattern from what I had to ISO8601 made it work, though, without doing a mutate+convert. Being pretty much a Logstash newbie, I am not sure why.

Badger, I think you have a good point here. It might be that sometimes create_ts comes in as a string and sometimes as a date type. If this is an import from Excel or CSV, that is possible.

@rickfish check the types of the documents in Elasticsearch that are tagged _dateparsefailure; you can filter on the tag in Kibana.
By the way, your date { match => ["create_ts", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"] } works fine on my side for "create_ts":"2023-01-09T05:13:21.411713Z". The documentation does not explicitly mention which type the first field should be; I assume it is the string type.
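One way to reproduce this in isolation (a sketch using the generator input with a json codec, which yields create_ts as a plain string, as in the s3 file):

```
input {
  generator {
    lines => ['{"create_ts":"2023-01-09T05:13:21.411713Z"}']
    count => 1
    codec => json
  }
}
filter {
  date { match => ["create_ts", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ"] }
}
output { stdout { codec => rubydebug } }
```

With the string form of the field, no _dateparsefailure tag is added.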


  • Value type is array
  • Default value is []

An array with field name first, and format patterns following, [ field, formats... ]
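For example, a single field can be matched against several patterns, tried in order until one succeeds (the patterns here are the ones discussed in this thread):

```
date {
  match => ["create_ts", "YYYY-MM-dd'T'HH:mm:ss.SSSSSSZ", "ISO8601"]
}
```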

@Rios, @Badger, long story. I guess the jdbc input plugin does create a timestamp column value as a timestamp in the event, not a string as I thought. I thought it was a string because the JSON created by my s3 output plugin converted the timestamp to a string. Since I couldn't see what was created by the jdbc input plugin, I assumed it looked exactly like what was put into the s3 file, since I had no filter, just input and output.

Anyway, in order to test it without the jdbc and s3 plugins, I created a file with the line from the s3 file and used a file input plugin to test. But at the same time I changed my date pattern to ISO8601, so when my test worked I thought it was the pattern that fixed it.

When I switched back to the jdbc input, it failed again.

So I added the mutate/convert filter as suggested by @Badger and it now works like a champ.

Sorry for the long explanation, especially if it is convoluted.

Thanks again so much for the help.

It's not long; actually, it is useful to get feedback on a suggestion like this.

Yes, it does.