Date Mapping Template Not Being Applied Correctly

Hi,
I'm receiving the following data set into ES from Logstash:

{
  "fields": {
    "date-code": "180502",
    "form-factor": "FORM1",
    "serial-no": "1122334455",
    "vendor": "VENDOR-PRE",
    "vendor-part": "PARTNUMBER001",
    "vendor-rev": "01"
  },
  "name": "optics",
  "tags": {
    "device": "router1.mgt.net",
  },
  "timestamp": 1702660837
}

The date-code should convert to May 2, 2018, since I've configured the format in the template to be yyMMdd; however, it's converting to May 1, 2018 @ 20:00:00.000. It's off by one day.

This is how I have the mapping configured in Kibana:

"date-code": {
          "type": "date",
          "format": "yyMMdd",
          "ignore_malformed": true
        }

Is this possibly a timezone issue?
How can I correct this?

Hi @mohsin106

You will need to provide a timezone. Without one, the value is most likely being interpreted as UTC, which is why the time is "apparently" off.

And it looks like it's off by 4 hours, which I would guess is related to your timezone...

All dates are stored in UTC and displayed in Kibana in your local timezone.
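A quick sketch of the effect in Ruby (the -04:00 offset is an assumption, inferred from the 4-hour discrepancy described above):

```ruby
# "180502" parsed with format yyMMdd and no timezone is taken as UTC midnight
utc = Time.utc(2018, 5, 2, 0, 0, 0)

# Viewed from a UTC-4 offset, that same instant falls on the previous
# evening -- which matches what Kibana shows when it renders the stored
# UTC value in local time
local = utc.getlocal("-04:00")
puts local.strftime("%Y-%m-%d %H:%M")  # 2018-05-01 20:00
```

So the stored value is correct; only the rendered local time makes it look a day early.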

You can look at the raw JSON to confirm.

How would I provide a timezone? Do I do this in Logstash or Kibana?

I have verified that I'm receiving date-code in the following two separate formats:

  1. 180502
  2. 2023-02-22T00:00:00Z00:00

I'm also getting these warnings in Logstash and this data is not being written to ES:

[WARN ] 2023-12-16 00:34:50.570 [[lab-optics-pipeline]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"ptx-optics-2023.12.16", :routing=>nil}, {"host"=>"lab-ptx-jti-deployment-596b65b758-hmrsk-telegraf-agent", "form-factor"=>"222222", "serial-no"=>"1111111111", "topic_name"=>"backbone-clean-json-optics-lab", "vendor-part"=>"333333", "vendor"=>"CISCO-ADDON     ", "timestamp"=>1702686889, "partition"=>"0", "device"=>"router.mgt.net", "date-code"=>"2023-02-22T00:00:00Z00:00", "@version"=>"1", "name"=>"optics", "@timestamp"=>2023-12-16T00:34:50.390Z, "vendor-rev"=>"D2", "optic-port"=>"4444", "system-id"=>"router.DUKE.AT"}], :response=>{"index"=>{"_index"=>"optics-2023.12.16", "_id"=>"KcIMcIwBx7hn7-75GvcY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [date-code] of type [long] in document with id 'KcIMcIwBx7hn7-75GvcY'. Preview of field's value: '2023-02-22T00:00:00Z00:00'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"2023-02-22T00:00:00Z00:00\""}}}}}

I need to understand how to apply the template correctly in Kibana, or convert the date in Logstash, so that both date formats can be written to ES properly. But I don't understand how to do this.

Hi @mohsin106

So you should also verify whether both dates are UTC or not ...
the 2nd format implies UTC (although it is a non-standard format).
Is the 1st supposed to be UTC? or have some timezone offset?

You need to understand/confirm that ... then we can make a plan.

Another question: are you

a) parsing those out of a log line in Logstash,
b) or are you receiving JSON in Logstash...

please share that logstash pipeline config and sample data as well.

In short, you need to understand what you are getting.
Then parse the dates if needed.
The mapping must support the proper format in order to store the date field, as explained here:

Internally, dates are converted to UTC (if the time-zone is specified) and stored as a long number representing milliseconds-since-the-epoch.

So the mapping's date formats must match what is coming in, so the value can be parsed to epoch millis and stored... basically, the error above says that could not happen.
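As a concrete illustration of the epoch-millis storage described above (a sketch, not Elasticsearch internals):

```ruby
# Elasticsearch stores a successfully parsed date as milliseconds
# since the epoch, in UTC. "180502" under format yyMMdd becomes
# 2018-05-02T00:00:00Z, i.e.:
t = Time.utc(2018, 5, 2)
millis = t.to_i * 1000
puts millis  # 1525219200000
```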

And for clarity, the mapping is an Elasticsearch mapping, not a Kibana one :slight_smile:

Here are the builtin formats

Here are the custom formats in detail

I have posed the timezone question to the vendor and am awaiting a response on whether 180502 is UTC or not.

Let's assume it is UTC; how do we move forward and get it logged in ES correctly? And how do I handle both date formats, considering the second one is non-standard?

I am receiving JSON in Logstash.

My Logstash config:

input {
    kafka {
      client_id => "lab-optics"
      topics => ["backbone-clean-json-optics-lab"]
      group_id => "lab-optics-v1"
      bootstrap_servers => "server.iaas.net:9093"
      consumer_threads => 1
      codec => "json"
      security_protocol => "SSL"
      ssl_keystore_location => "/usr/share/logstash/config/myjksfile.jks"
      ssl_keystore_password => "password"
      ssl_keystore_type => "PKCS12"
      ssl_truststore_location => "/usr/share/logstash/config/ca.jks"
      decorate_events => true
    }
}

filter { 
    json { 
        source => "message" 
        remove_field => [ "message" ] 
    }

    if [fields] {
        ruby {
        code => '
            event.get("fields").each { |k, v|
            event.set(k,v)
            }
            event.remove("fields")
        '
        }
    }
    
    if [tags] {
        ruby {
            code => '
            event.get("tags").each { |k, v|
                event.set(k,v)
            }
                event.remove("tags")
            '
        }
    }

    # date {
    #     match => ["date-code", "yyMMdd"]
    #     target => "converted_date"
    # }

}

output {
    elasticsearch {
        index => "optics-%{+YYYY.MM.dd}"
        hosts => "https://es.mgt.net:443"
        user => "username"
        password => "password"
    }
}
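For reference, the commented-out date filter above could be extended to try both incoming patterns (a sketch only; field and target names are taken from that commented-out block, and the non-standard `Z00:00` variant would still need to be rewritten before ISO8601 parsing can succeed):

```
filter {
  date {
    # Try the compact vendor format first, then ISO8601 for the long variant
    match => ["date-code", "yyMMdd", "ISO8601"]
    timezone => "UTC"
    target => "converted_date"
  }
}
```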

First, that is not really a valid format.

The Z means UTC.
Adding the 00:00 does not add precision, AND it then does not fit any format. I cannot figure out how to make it fit using all the tricks/formats I have tried; basically, it is a bad format.

These are valid formats

2023-02-22T00:00:00Z
2023-02-22T00:00:00+00:00

So if you can fix that (in Logstash, I think a simple gsub to replace the Z with a + will work), then you just set your mapping to the following:

PUT discuss-test-time
{
  "mappings": {
    "properties": {
      "date-code": {
        "type": "date",
        "format": "strict_date_optional_time||yyMMdd"
      }
    }
  }
}

# Works
POST discuss-test-time/_doc
{
  "date-code" : "180502"
}

# Works 
POST discuss-test-time/_doc
{
  "date-code" : "2023-02-22T00:00:00+00:00"
}


# Works 
POST discuss-test-time/_doc
{
  "date-code" : "2023-02-22T00:00:00Z"
}

# Does Not Work
POST discuss-test-time/_doc
{
  "date-code" : "2023-02-22T00:00:00Z00:00"
}

# POST discuss-test-time/_doc 201 Created
{
  "_index": "discuss-test-time",
  "_id": "oIF3hIwBZMvS5ljkOQn1",
  "_version": 1,
  "result": "created",
  "_shards": {
    "total": 2,
    "successful": 1,
    "failed": 0
  },
  "_seq_no": 0,
  "_primary_term": 1
}
# POST discuss-test-time/_doc 201 Created
{
  "_index": "discuss-test-time",
  "_id": "oYF3hIwBZMvS5ljkOgkD",
  "_version": 1,
  "result": "created",
  "_shards": {
    "total": 2,
    "successful": 1,
    "failed": 0
  },
  "_seq_no": 1,
  "_primary_term": 1
}
# POST discuss-test-time/_doc 201 Created
{
  "_index": "discuss-test-time",
  "_id": "ooF3hIwBZMvS5ljkOgkN",
  "_version": 1,
  "result": "created",
  "_shards": {
    "total": 2,
    "successful": 1,
    "failed": 0
  },
  "_seq_no": 2,
  "_primary_term": 1
}
# POST discuss-test-time/_doc 400 Bad Request
{
  "error": {
    "root_cause": [
      {
        "type": "document_parsing_exception",
        "reason": "[2:17] failed to parse field [date-code] of type [date] in document with id 'o4F3hIwBZMvS5ljkOgkb'. Preview of field's value: '2023-02-22T00:00:00Z00:00'"
      }
    ],
    "type": "document_parsing_exception",
    "reason": "[2:17] failed to parse field [date-code] of type [date] in document with id 'o4F3hIwBZMvS5ljkOgkb'. Preview of field's value: '2023-02-22T00:00:00Z00:00'",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "failed to parse date field [2023-02-22T00:00:00Z00:00] with format [strict_date_optional_time||yyMMdd]",
      "caused_by": {
        "type": "date_time_parse_exception",
        "reason": "Failed to parse with all enclosed parsers"
      }
    }
  },
  "status": 400
}
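The gsub fix mentioned above can be sketched as a Logstash filter (the field name date-code is assumed, matching the pipeline config earlier in the thread):

```
filter {
  mutate {
    # Rewrite the non-standard trailing "Z00:00" to a valid "+00:00" offset,
    # turning "2023-02-22T00:00:00Z00:00" into "2023-02-22T00:00:00+00:00"
    gsub => [ "date-code", "Z00:00$", "+00:00" ]
  }
}
```

With that in place, both 180502 and the rewritten ISO timestamp should match the strict_date_optional_time||yyMMdd mapping shown above.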

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.