Date parsed but not properly indexed

Hello,

I'm running Logstash 6.2.3, parsing some JSON out of a message log and then parsing the datetime fields from that JSON.

When the JSON arrives, I have a field like this:

"image_time_stamp": "2018-05-02-15:43:30.672"

and once it has been parsed, it shows up in Kibana as:

image_time_stamp 2018-05-02T15:43:30.672Z

So far so good, except that when I check the index mapping, I see that "image_time_stamp" is still a string.

I need this field and a couple of others to be mapped as datetime, since I'm trying to monitor them through graphs.
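For reference, this is how I'm checking the mapping, from the Kibana Dev Tools console (the index name here is just an example of one of my daily indices):

  GET logstash-2018.05.02/_mapping

and in the response, image_time_stamp comes back with "type": "text" (plus a keyword sub-field) rather than "date".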

Here is the filter section, once the JSON has been extracted through grok:

  json {
    source => "data_json"
  }
  date {
    match => ["time_stamp_desc", "yyyy-MM-dd-HH:mm:ss.SSSSSS"]
    timezone => "UTC"
    target => "time_stamp_desc"
  }
  date {
    match => ["image_time_stamp", "yyyy-MM-dd-HH:mm:ss.SSSSSS"]
    timezone => "UTC"
    target => "image_time_stamp"
  }
  date {
    match => ["time_stamp_roi", "yyyy-MM-dd-HH:mm:ss.SSSSSS"]
    timezone => "UTC"
    target => "time_stamp_roi"
  } 

Any idea how to make sure that the index mapping for these fields will be date?
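I'm guessing I need something like an index template so that new indices map these fields as date from the start? This is roughly what I have in mind (not tested; the template name, index pattern and doc type ("doc" is the Logstash 6.x default) are just placeholders for my setup):

  PUT _template/image_timestamps
  {
    "index_patterns": ["logstash-*"],
    "mappings": {
      "doc": {
        "properties": {
          "image_time_stamp": { "type": "date" },
          "time_stamp_desc":  { "type": "date" },
          "time_stamp_roi":   { "type": "date" }
        }
      }
    }
  }

From what I understand this would only apply to indices created after the template is in place, not to the existing one.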

Many thanks!

PS: to avoid any confusion, yes, I refreshed the index pattern within Kibana :wink:

Hehehe, actually we can close this topic.
I pointed the date filter at a new target field and realised that the new target is indexed as a date.

I'm just wondering if there is another option that would let me avoid deleting the entire index, but that's another topic.
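(If it helps anyone later: I suppose something like the Reindex API could copy the existing data into a new index that already has the right mapping, rather than deleting everything. Roughly, with placeholder index names:

  POST _reindex
  {
    "source": { "index": "logstash-old" },
    "dest":   { "index": "logstash-new" }
  }

but again, that's another topic.)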

Sorry to the people who will spend time reading this!
