Unable to convert field to date type with date filter

Hi,

I am unable to convert a field to the date type with the date filter. Here's what I wrote to achieve that:

    mutate {
      add_field => {
        "end_date_parse" => "%{end_date} %{zimbra_proc_end_time}"
        "start_date_parse" => "%{start_date} %{zimbra_proc_start_time}"
      }
    }

    date {
      match => [ "start_date_parse", "HH:mm:ss", "dd MMM yyyy HH:mm:ss.SSSZ", "d MMM yyyy HH:mm:ss.SSSZ" ]
      target => "zimbra_proc_start_time"
    }

    date {
      match => [ "end_date_parse", "HH:mm:ss", "dd MMM yyyy HH:mm:ss.SSSZ", "d MMM yyyy HH:mm:ss.SSSZ" ]
      target => "zimbra_proc_end_time"
    }

The format for start_date is 16 Jun 2018, and for zimbra_proc_start_time it is HH:mm:ss.SSSZ.
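
Just to make the intent explicit, here is a minimal sketch (the sample time value is only an assumption based on the stated format) of what the concatenated field should contain and the pattern that has to match it:

    # Sketch only: with start_date = "16 Jun 2018" and an assumed sample
    # zimbra_proc_start_time of "09:32:45.000+0000", the mutate above yields
    #   start_date_parse = "16 Jun 2018 09:32:45.000+0000"
    # which is what the "dd MMM yyyy HH:mm:ss.SSSZ" pattern is meant to match.
    date {
      match => [ "start_date_parse", "dd MMM yyyy HH:mm:ss.SSSZ", "d MMM yyyy HH:mm:ss.SSSZ" ]
      target => "zimbra_proc_start_time"
    }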

And still the target fields are being reported as keyword and not date.

What do the end_date, start_date, end_date_parse, start_date_parse, zimbra_proc_start_time, and zimbra_proc_end_time fields actually look like? Please post either the output from

    output { stdout { codec => rubydebug } }

in logstash, or, if you are using Kibana, copy and paste from the JSON tab in Discover.
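
For example, a throwaway pipeline along these lines (just a sketch; drop your existing filters into the filter block) lets you paste a sample log line on stdin and see every field of the resulting event:

    # Minimal debug pipeline sketch: paste a sample line on stdin and inspect
    # the parsed event, including start_date_parse and end_date_parse, on stdout.
    input {
      stdin { }
    }

    filter {
      # ... your existing grok / mutate / date filters go here ...
    }

    output {
      stdout { codec => rubydebug }
    }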

@badger Now, the date is not even getting calculated correctly. I'll open another question with the JSON for this issue and then we can look at the date field type conversion. But just for reference, I'll paste the JSON here:

    {
      "_index": "filebeat-6.2.4-2018.06.22-index",
      "_type": "doc",
      "_id": "AxpzJ2QB3tEACYuyWrVz",
      "_version": 2,
      "_score": null,
      "_source": {
        "zimbra_proc_end_time": "2018-01-01T09:32:46.000Z",
        "tags": [
          "umm-logs-zimbra-proc-failure",
          "beats_input_codec_plain_applied"
        ],
        "source": "/opt/openvault/umm/logs/ummProcess.log",
        "host": "ip-192-168-0-139",
        "offset": 308705,
        "log_level": "INFO",
        "start_date": "17 Apr 2018",
        "prospector": {
          "type": "log"
        },
        "day": "Tue",
        "end_date": "17 Apr 2018",
        "@timestamp": "2018-06-22T12:23:44.474Z",
        "zimbra_proc_start_time": "2018-01-01T09:32:45.000Z",
        "beat": {
          "name": "ip-192-168-0-139",
          "hostname": "ip-192-168-0-139",
          "version": "6.2.4"
        },
        "@version": "1",
        "message": "17 Apr 2018 09:32:45\tINFO\tZimbra notification file generation started  at: Tue Apr 17 09:32:45 CDT 2018\n17 Apr 2018 09:32:45\tINFO\t  Executing Zimbra file generation process\nWarning: Using a password on the command line interface can be insecure.\n17 Apr 2018 09:32:46\tINFO\tFinished at: Tue Apr 17 09:32:46 CDT 2018"
      },
      "fields": {
        "@timestamp": [
          "2018-06-22T12:23:44.474Z"
        ],
        "zimbra_proc_time": [
          1
        ],
        "zimbra_proc_end_time": [
          "2018-01-01T09:32:46.000Z"
        ],
        "csg_proc_time": [
          ""
        ],
        "perftech_proc_time": [
          ""
        ],
        "zimbra_proc_start_time": [
          "2018-01-01T09:32:45.000Z"
        ]
      },
      "sort": [
        1514799165000
      ]
    }

As you can see, the date calculation doesn't make any sense here, which is strange.

The only change I made was to add 'T':

    date {
      match => [ "start_date_parse", "yyyy-MM-dd'T'HH:mm:ss.SSSZ", "HH:mm:ss", "dd MMM yyyy HH:mm:ss.SSSZ", "d MMM yyyy HH:mm:ss.SSSZ" ]
      target => "zimbra_proc_start_time"
    }

    date {
      match => [ "end_date_parse", "yyyy-MM-dd'T'HH:mm:ss.SSSZ", "HH:mm:ss", "dd MMM yyyy HH:mm:ss.SSSZ", "d MMM yyyy HH:mm:ss.SSSZ" ]
      target => "zimbra_proc_end_time"
    }

And the time format is: HH:mm:ss

zimbra_proc_start_time: 09:40:41

Why doesn't the document contain the start_date_parse and end_date_parse fields that you are feeding into the date filter? If you are deleting those fields before you are certain they are being parsed correctly, well, don't do that.

Given that your document does not contain a _dateparsefailure tag, it is possible that the date filter is parsing the fields correctly and this is really an elasticsearch indexing question, not a logstash question.
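
One quick way to check which side it is (just a sketch, with the index name copied from the document posted above) is to ask elasticsearch what type it actually assigned to those fields. The type shows up under mappings.doc.properties in the response; if it says keyword there, it is the index mapping, not the date filter, that needs fixing:

    GET filebeat-6.2.4-2018.06.22-index/_mapping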

Yes, I am removing them. The reason is that a couple of days back this was working perfectly, except that the field type was being reported as keyword and not date.

After that, I made some changes, and now everything's messed up, as you can see. I can post this question in the Elasticsearch forum.

@Badger I am seeing this in the logstash logs:

    [2018-06-22T17:40:54,132][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.2.4-2018.06.22", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x12b3fb04>], :response=>{"index"=>{"_index"=>"filebeat-6.2.4-2018.06.22", "_type"=>"doc", "_id"=>"5huVKGQB3tEACYuyphjR", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [zimbra_proc_start_time]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"04:45:31\" is malformed at \":45:31\""}}}}}

Does that mean there is some parsing issue?

Yeah, logstash is getting an error from elasticsearch when it tries to index the document. I'm not sure exactly what it is objecting to. The elasticsearch log might have a better error message.

This is what the ES logs say:

    Caused by: java.lang.IllegalArgumentException: Invalid format: "04:56:32" is malformed at ":56:32"
            at org.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187) ~[joda-time-2.9.9.jar:2.9.9]
            at org.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:826) ~[joda-time-2.9.9.jar:2.9.9]
            at org.elasticsearch.index.mapper.DateFieldMapper$DateFieldType.parse(DateFieldMapper.java:248) ~[elasticsearch-6.2.4.jar:6.2.4]
            at org.elasticsearch.index.mapper.DateFieldMapper.parseCreateField(DateFieldMapper.java:456) ~[elasticsearch-6.2.4.jar:6.2.4]
            at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:297) ~[elasticsearch-6.2.4.jar:6.2.4]
            ... 57 more

Something related to millis? The format of the field when I do a GET request on the index is yyyy-MM-dd'T'HH:mm:ss.SSSZ

Solved:
This was because I had added milliseconds (the .SSSZ part) to the parsing pattern, which was not part of the time format. Thanks @Badger for the help.
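
In other words (a minimal sketch, assuming the concatenated value looks like 17 Apr 2018 09:32:45, as in the document above; the end-time filter is analogous), the match pattern just needs to drop the .SSSZ part:

    # Sketch of the corrected filter: the pattern now matches the concatenated
    # value, e.g. "17 Apr 2018 09:32:45", with no milliseconds or time zone.
    date {
      match => [ "start_date_parse", "dd MMM yyyy HH:mm:ss", "d MMM yyyy HH:mm:ss" ]
      target => "zimbra_proc_start_time"
    }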

I am still not able to convert the field to the date type. It is still getting reported as keyword after using the date filter. Any inputs? No _dateparsefailure in sight. @magnusbaeck It would be nice if you could point out something in this case, as the date parsing is working just fine.
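
For reference, the mapping of a field in an existing index cannot be changed, so a field that was first indexed as keyword stays keyword even once the date filter produces a proper timestamp. One way to handle it, sketched below under the assumption of Elasticsearch 6.x (the template name is a placeholder, and the order may need adjusting so it wins over the Filebeat template), is an index template that maps the fields as date in newly created indices; existing indices would still need to be reindexed or replaced:

    PUT _template/zimbra-proc-dates
    {
      "index_patterns": ["filebeat-*"],
      "order": 1,
      "mappings": {
        "doc": {
          "properties": {
            "zimbra_proc_start_time": { "type": "date" },
            "zimbra_proc_end_time":   { "type": "date" }
          }
        }
      }
    }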
