Issue building a visualization using date fields

Hi All,

I have ingested some data into Elasticsearch, and it contains a few date fields, such as:
sys_created_on
sys_start_time
sys_end_time
sys_closed_time

While indexing, I indexed the data using @timestamp.

Now I'm trying to create a line chart using sys_created_on to show a timeline-style chart, but I get no results when I choose any of those sys_ fields. Data is available only if I choose @timestamp.

Any advice on how to use those date fields to showcase a trend analysis?

Note: I have written a transformation in Dev Tools, and it returns all those sys_ fields as proper date fields.

Thanks
Gautham

I have deleted my index and recreated it with sys_created_on as the time field. After index creation I can see all the field values in the index pattern, but when I check the Discover tab it says no data found.
I have also extended my time range to 10 years, but still no data.

Any advice please?

Thanks
Gautham

Hello @Gauti

  • Can you share the output of GET <index name>/_mapping from the Kibana Dev Console, or check whether the fields you're mentioning are of type date? (See the example after this list.)
  • If you've deleted the index and re-created it, to ensure any mapping change has been taken into account, please go to Kibana / Management / Index Patterns and refresh the index pattern to force it to detect any changes in the types. You should see the fields with the correct type; if not, there's a problem with the mappings.
  • What field did you choose as the time field in the index pattern?
  • To double-check that the data is there, please go to the Kibana Dev Console and execute:
    GET <index name>/_search
    {
      "size": 0,
      "aggs": {
        "histo": {
          "date_histogram": {
            "field": "<date field>",
            "calendar_interval": "day"
          }
        }
      }
    }
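
For the mapping check in the first point, it's simply:

    GET <index name>/_mapping

In the response, each of the sys_ fields should show "type": "date".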
    

Hi @Luca_Belluccini

I have verified the field is of type date. Below are the mapping details for that particular field:

    "result.sys_created_on" : {
      "type" : "date",
      "ignore_malformed" : true
    },  

Here is the output of GET sampledata1/_search.

For index sampledata1 I chose @timestamp as the time field.
Output for @timestamp:

{
  "took" : 3,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 9804,
      "relation" : "eq"
    },
    "max_score" : null,
    "hits" : [ ]
  },
  "aggregations" : {
    "histo" : {
      "buckets" : [
        {
          "key_as_string" : "2020-05-14T00:00:00.000Z",
          "key" : 1589414400000,
          "doc_count" : 9804
        }
      ]
    }
  }
}

I also created one more index, sampledata11, where I chose result.sys_created_on as the time field.
Output for result.sys_created_on:

{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 9804,
      "relation" : "eq"
    },
    "max_score" : null,
    "hits" : [ ]
  },
  "aggregations" : {
    "histo" : {
      "buckets" : [ ]
    }
  }
}

Thanks
Gautham

You have ignore_malformed: true on the field sys_created_on, which silently drops values that fail to parse instead of rejecting the document with an indexing error.

My query builds date buckets on a date field, and as you can see it fails to create buckets on sys_created_on, meaning the date failed to be indexed (but you will probably still find the value in the _source).
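
You can also find the documents where values were ignored at index time by querying the _ignored metadata field:

GET <index name>/_search
{
  "query": {
    "exists": { "field": "_ignored" }
  }
}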

Would it be possible to share some sample documents by running GET <index name>/_search?

@Luca_Belluccini

Below are the documents.

{
  "took" : 3,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 9804,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "sampledata11",
        "_type" : "_doc",
        "_id" : "INC222122",
        "_score" : 1.0,
        "_ignored" : [
          "result.sys_created_on",
          "result.planned_end_time",
          "result.start_time",
          "result.end_time",
          "result.original_breach_time",
          "result.pause_time",
          "result.sys_updated_on",
          "result.business_time_left"
        ],
        "_source" : {
          "@timestamp" : "2020-05-14T18:13:28.060Z",
          "@version" : "1",
          "result" : {
            "planned_end_time" : "11-26-2013 09:58:48",
            "has_breached" : "true",
            "active" : "false",
            "sys_created_by" : "user1",
            "sys_id" : "0234fsdd8406cd9406f05ba34324d",
            "sys_mod_count" : "64",
            "business_time_left" : "0 Seconds",
            "percentage" : "459.81",
            "pause_time" : "",
            "start_time" : "10-16-2013 16:17:48",
            "schedule" : "8-18 weekdays excluding holidays",
            "sys_updated_on" : "12-16-2013 08:45:20",
            "duration" : "22 Days 23 Hours 46 Minutes",
            "stage" : "Completed",
            "original_breach_time" : "",
            "time_left" : "0 Seconds",
            "sys_tags" : "",
            "sys_created_on" : "10-16-2013 16:17:49",
            "business_duration" : "7 Days 46 Minutes",
            "sys_updated_by" : "admin3",
            "end_time" : "12-16-2013 08:45:20",
            "timezone" : "America/New_York",
            "task" : "INC222122",
            "sla" : "Priority 3 resolution (3 day mon-fri)",
            "business_pause_duration" : "10 Days 13 Hours 41 Minutes",
            "pause_duration" : "37 Days 17 Hours 41 Minutes",
            "business_percentage" : "562.59"
          }
        }
      },
      {
        "_index" : "sampledata11",
        "_type" : "_doc",
        "_id" : "INC765063",
        "_score" : 1.0,
        "_ignored" : [
          "result.percentage",
          "result.business_percentage",
          "result.sys_created_on",
          "result.planned_end_time",
          "result.start_time",
          "result.end_time",
          "result.original_breach_time",
          "result.pause_time",
          "result.sys_updated_on",
          "result.business_time_left"
        ],
        "_source" : {
          "@timestamp" : "2020-05-14T19:13:28.749Z",
          "@version" : "1",
          "result" : {
            "planned_end_time" : "06-25-2018 09:26:05",
            "has_breached" : "true",
            "active" : "false",
            "sys_created_by" : "user5",
            "sys_id" : "0bsdkflsnlkef9db765723545b961983",
            "sys_mod_count" : "91",
            "business_time_left" : "0 Seconds",
            "percentage" : "1,139.76",
            "pause_time" : "",
            "start_time" : "06-20-2018 09:26:05",
            "schedule" : "8-18 weekdays excluding holidays",
            "sys_updated_on" : "08-16-2018 09:09:09",
            "duration" : "56 Days 23 Hours 43 Minutes",
            "stage" : "Completed",
            "original_breach_time" : "06-25-2018 09:26:05",
            "time_left" : "0 Seconds",
            "sys_tags" : "",
            "sys_created_on" : "06-20-2018 09:26:05",
            "business_duration" : "17 Days 1 Hour 43 Minutes",
            "sys_updated_by" : "admin1",
            "end_time" : "08-16-2018 09:09:09",
            "timezone" : "America/New_York",
            "task" : "INC765063",
            "sla" : "Priority 3 resolution (3 day mon-fri)",
            "business_pause_duration" : "",
            "pause_duration" : "",
            "business_percentage" : "1,365.73"
          }
        }
      }
    ]
  }
}

Thanks
Gautham

The fields ended up in _ignored, meaning they couldn't be indexed as dates (doc).

This happens because the date format is not the default one.

To parse 06-25-2018 09:26:05 you need to provide:

  • MM-dd-uuuu HH:mm:ss if you're on Elasticsearch 7
  • 8MM-dd-uuuu HH:mm:ss if you're on Elasticsearch 6.8

This difference between versions is due to the changes in date/time handling in Elasticsearch (doc).

In your mapping, you should use the following (replace ... with one of the patterns above):

"result.sys_created_on" : {
      "type" : "date",
      "format": "..."
    },  
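
For example, when creating the new index, the full request could look like this (a minimal sketch for Elasticsearch 7; sampledata12 is just an example name):

PUT sampledata12
{
  "mappings": {
    "properties": {
      "result": {
        "properties": {
          "sys_created_on": {
            "type": "date",
            "format": "MM-dd-uuuu HH:mm:ss"
          }
        }
      }
    }
  }
}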

You cannot change the mapping of an existing field, so you have 2 options:

  • delete the index and reindex everything
  • use the Reindex API to copy the data from your current index to a new index with the correct mapping (you have to create the destination index with the correct mapping first); see the sketch after this list
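
For the second option, a minimal sketch of the Reindex API call (assuming the destination index sampledata12 has already been created with the corrected mapping):

POST _reindex
{
  "source": { "index": "sampledata11" },
  "dest": { "index": "sampledata12" }
}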

@Luca_Belluccini

Thank you for pointing out the issues. I will make the changes.

One more question: what if some of the date fields are empty? For example, I have another field, result.sys_close_date.
This field has a value only when the record is closed, and when the field is empty Logstash throws a parse error :frowning:

Any advice on that please?

Thanks
Gautham

If you want to skip a field when it's empty:

if [result][sys_close_date] =~ /^$/ {
  mutate { remove_field => [ "[result][sys_close_date]" ] }
} 
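
If you'd rather normalize the value in Logstash instead of relying on a custom mapping format, here's a sketch using the date filter (the pattern assumes the MM-dd-yyyy HH:mm:ss format seen in your sample documents):

filter {
  if [result][sys_close_date] =~ /^$/ {
    mutate { remove_field => [ "[result][sys_close_date]" ] }
  } else {
    date {
      # parse the custom timestamp into a proper date the default mapping accepts
      match => [ "[result][sys_close_date]", "MM-dd-yyyy HH:mm:ss" ]
      target => "[result][sys_close_date]"
    }
  }
}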
