JSON timestamp coming as text

Basically I am not getting a @timestamp field that is a date - it's coming through as text.

This is my .conf file (with some stuff excluded for security):

input {
  beats {
    port => 5048
  }
}

filter {
  json {
    source => "message"
  }
  date {
    match => [ "timestamp", "YYYY-MM-ddTHH:mm:ss,sss+hhhh" ]
  }
}

output {
  elasticsearch {
    # excluded as I know this works
  }
}

and my log format looks like

{"type":"audit", "timestamp":"2023-09-07T14:34:58,359+0100", "node.id":"MFOk8jclQlW3-fVz7xHghQ", "event.type":"transport", "event.action":"access_granted", "authentication.type":"REALM", "user.name":"kibana_system", "user.realm":"reserved", "user.roles":["kibana_system"], "origin.type":"transport", "origin.address":"*****", "request.id":"5f7V2f5HQWeMIdi3B1qcZQ", "action":"cluster:monitor/nodes/info[n]", "request.name":"NodeInfoRequest"}

I cannot understand why it's not becoming a searchable timestamp field.

Please help

You mean the timestamp field, not @timestamp, right?

The @timestamp field coming from Logstash will always be a date field, even if you do not have a date filter or if the date filter fails.

  date {
    match => [ "timestamp", "YYYY-MM-ddTHH:mm:ss,sss+hhhh" ] 
  }

This filter will parse the date string in the timestamp field and save the result as @timestamp, which is the default target. The original timestamp field will not be changed, and if Elasticsearch does not recognize it as a date string, which can happen sometimes, it will be stored as a string.
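For example, if you add a stdout { codec => rubydebug } output for debugging, an event would show both fields (values here are illustrative):

```
{
    "@timestamp" => 2023-09-07T13:34:58.359Z,         # parsed date, the default target
     "timestamp" => "2023-09-07T14:34:58,359+0100"    # original field, left as a string
}
```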

If you want to make sure that the timestamp field is saved as a date in Elasticsearch, you need another date filter where you set it as the target.

  date {
    match => [ "timestamp", "YYYY-MM-ddTHH:mm:ss,sss+hhhh" ] 
    target => "timestamp"
  }

This will convert the timestamp field from a string to a date field.

You may need to recreate your destination index as this will change the mapping of the field.
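For example, in Kibana Dev Tools you can check the current mapping and, if it is wrong, delete the index so it is recreated with the new mapping on the next write (my-audit-index is a placeholder name):

```
GET my-audit-index/_mapping

# careful: this deletes the data in the index
DELETE my-audit-index
```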

So basically, when I create the index pattern, it tells me that the @timestamp field is text, as is Time. Because of this, that index pattern isn't viewable in Discover.

Did you create a template for your index? The mappings seem to be wrong.

You would need to create an index template with the correct mapping.


I do have an index template with the following mapping:

{
  "template": {
    "settings": {
      "index": {
        "number_of_shards": "1",
        "number_of_replicas": "1"
      }
    },
    "mappings": {
      "dynamic": "true",
      "dynamic_date_formats": [
        "strict_date_optional_time",
        "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"
      ],
      "dynamic_templates": [],
      "date_detection": true,
      "numeric_detection": false
    },
    "aliases": {}
  }
}

I am pretty new to all this, so I really appreciate your help.

I don't think this will match the format of the @timestamp field generated by Logstash, which would explain why it was mapped as text.

I would suggest removing the dynamic_date_formats setting, since it is not needed in most cases and is a more advanced setting that you do not want when you are starting out with Elastic.

Just change your template into this:

{
  "template": {
    "settings": {
      "index": {
        "number_of_shards": "1",
        "number_of_replicas": "1"
      }
    },
    "mappings": {
      "dynamic": "true",
      "date_detection": true,
      "numeric_detection": false
    },
    "aliases": {}
  }
}

This will make Elasticsearch automatically map the date fields coming from Logstash as date.
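For instance, the @timestamp value Logstash emits is ISO8601 with dot-separated milliseconds (e.g. 2023-09-07T13:34:58.359Z), which matches the default strict_date_optional_time format. A quick check in Dev Tools (test-dynamic is a throwaway index name) should show such a field dynamically mapped as date:

```
PUT test-dynamic/_doc/1
{ "ts": "2023-09-07T13:34:58.359Z" }

GET test-dynamic/_mapping
```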

I think that should be "yyyy-MM-dd'T'HH:mm:ss,SSSZ" - the literal T needs to be quoted, the millisecond pattern is SSS, and Z matches the +0100 offset.
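Putting that corrected pattern together with the target suggestion above, the filter might look like this (a sketch; the pattern assumes the comma-separated milliseconds and +0100-style offset shown in the sample log line):

```
filter {
  json {
    source => "message"
  }
  date {
    # quoted 'T', SSS for milliseconds, Z for the +0100 offset
    match  => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ss,SSSZ" ]
    target => "timestamp"
  }
}
```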

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.