Date data type and time for Kibana

Hi all experts, I'm a beginner with this technology and I have some doubts about the date data type.
My data looks like the sample below. Using Python to interact with the cluster, I use this script to combine date and time so the value matches the shape the Elasticsearch client expects:

    dt_string = data['DATE'] + " " + data['TIME']
    timestamp = datetime.datetime.strptime(dt_string, '%m/%d/%y %H:%M:%S')
    # e.g. '04/30/18' + '14:25:40'

Then I insert this information into a JSON document and index it. In Kibana the TIMESTAMP value is recognized as a date, as you can see in the snapshot of the data below. The problem is that I can't use this field in any kind of graph/plot; it's as if Kibana doesn't recognize it.
DATE: 04/25/18
DATE.keyword: 04/25/18
TIME: 10:14:03
TIME.keyword: 10:14:03
TIMESTAMP: Apr 25, 2018 @ 12:14:03.000
Is there a way to convert the time data from string to a time type, so I can experiment with plotting data based on time?
Thanks
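One option worth noting here, assuming you control the indexing script: emit the combined timestamp as an ISO 8601 string, which Elasticsearch's default dynamic date detection (`strict_date_optional_time`) maps as a `date` automatically, with no explicit mapping needed. A minimal sketch using the same format string as the snippet above:

```python
import datetime

# Combine the separate DATE and TIME strings (same format as above)
date_str, time_str = "04/30/18", "14:25:40"
dt = datetime.datetime.strptime(date_str + " " + time_str, "%m/%d/%y %H:%M:%S")

# ISO 8601 is one of Elasticsearch's default dynamic date formats, so a string
# field holding this value is mapped as `date` without an explicit mapping.
timestamp = dt.isoformat()
print(timestamp)  # 2018-04-30T14:25:40
```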

Check your index mappings: your field appears to be mapped as a [keyword](Keyword type family | Elasticsearch Guide [7.12] | Elastic) type, while you want it to be mapped as date.

Take a look at those docs to see the different formats accepted when indexing date documents. Probably most interesting for your use case: you can specify formats directly in the type mapping.
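As a sketch of what such a mapping could look like when built from Python (the index name is illustrative, and the client calls assume the official elasticsearch-py library):

```python
# Sketch: mapping body for an index whose TIME field is a date with an
# explicit format. Multiple accepted formats are joined with "||".
mapping_body = {
    "mappings": {
        "properties": {
            "TIME": {
                "type": "date",
                "format": "MM/dd/yy HH:mm:ss||epoch_millis",
            }
        }
    }
}

# With a running cluster you would then call something like:
# from elasticsearch import Elasticsearch
# es = Elasticsearch([{"host": "localhost", "port": 9200}])
# es.indices.create(index="my-index-000001", body=mapping_body)
print(mapping_body["mappings"]["properties"]["TIME"]["format"])
```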

Thanks, but I have a doubt about the documentation you posted. I need to specify the field that I would like to modify, right? Suppose I want to map TIME as suggested by the documentation. What do I need to enter in the Dev Tools console?

    PUT my-index-000001
    {
      "mappings": {
        "properties": {
          "TIME": {
            "type": "date"
          }
        }
      }
    }

I'm not sure I understand what you mean, but if you are referring to updating the mapping of an existing index, that's not recommended. Check this doc page.

It's recommended that you create a new index with the proper mappings and then reindex your existing data into the new one.
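From Python, that reindex step can be sketched roughly like this (index names are hypothetical, and the call assumes the elasticsearch-py client):

```python
# Sketch: request body for the _reindex API, copying documents from the old
# index into a new one that was already created with the correct date mapping.
reindex_body = {
    "source": {"index": "my-index-000001"},
    "dest": {"index": "my-new-index-000001"},
}

# Against a live cluster:
# from elasticsearch import Elasticsearch
# es = Elasticsearch([{"host": "localhost", "port": 9200}])
# es.reindex(body=reindex_body)
print(sorted(reindex_body))  # ['dest', 'source']
```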

OK, so if I understand correctly, I need to create another index with the following script:

    POST _reindex
    {
      "source": {
        "index": "my-index-000001",
        "TIME": {
          "type": "string"
        }
      },
      "dest": {
        "index": "my-new-index-000001",
        "TIME": {
          "type": "string"
        }
      }
    }

But instead of doing that, isn't it easier to do the following?

```
from datetime import datetime
from json import loads

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'stream',
    bootstrap_servers=['localhost:9092'],
    auto_offset_reset='earliest',
    enable_auto_commit=True,
    group_id='my-group-id',
    value_deserializer=lambda x: loads(x.decode('utf-8')))

i = 0
indexVal = "provatempi"  # index names must be lowercase
es = Elasticsearch([{'host': 'localhost', 'port': 9200}])
fmt = '%m/%d/%y %H:%M:%S'

for event in consumer:
    data = event.value
    dt_string = data['DATE'] + " " + data['TIME']
    print(dt_string)
    timestampRow = datetime.strptime(dt_string, fmt)
    event_data = {'START TIME': data['START TIME'],
                  'DATE': data['DATE'],
                  'TIME': data['TIME'],
                  'TIMESTAMP': timestampRow}
    i = i + 1
    res = es.index(index=indexVal, id=i, body=event_data)
```
So with this script I manage to insert a date type into Kibana: after creating the index pattern I can see the date data type for TIMESTAMP. Is it OK to apply the same date+time conversion to single values like START TIME as well?
Thanks

My only doubt is that when I go to the Discover section in Kibana and use TIMESTAMP as the measurement for building a visualization, it doesn't present any information. I don't really understand whether reindexing is the only way to manage the data. Thanks for the attention.

Below is a complete workflow to test in Dev Tools: create an index with a text field, then a new index with a date field in your format, then a reindex to copy the documents from the first index into the second as dates, and finally a simple query to confirm it worked.

# Regenerate the index
DELETE my-index-000001
PUT my-index-000001
{
  "mappings": {
    "properties": {
      "TIME": {
        "type": "text" 
      }
    }
  }
}

# Add some sample docs
POST my-index-000001/_doc
{
  "TIME": "04/30/18 14:25:40"
}
POST my-index-000001/_doc
{
  "TIME": "02/15/18 14:05:40"
}
POST my-index-000001/_doc
{
  "TIME": "02/01/18 14:25:40"
}

# Check the docs
GET my-index-000001/_search

# Regenerate the destination index
DELETE my-index-000002
PUT my-index-000002
{
  "mappings": {
    "properties": {
      "TIME": {
        "type": "date",
        "format": "MM/dd/yy HH:mm:ss"
      }
    }
  }
}

# Reindex from the old to the new index
POST _reindex
{
  "source": {
    "index": "my-index-000001"
  },
  "dest": {
    "index": "my-index-000002"
  }
}

# Check the docs with a date range query
GET my-index-000002/_search
{
  "query": {
    "range": {
      "TIME": {
        "gte": "04/15/18 00:00:00"
      }
    }
  }
}

Once the second index is working, you can generate a new index pattern so it's available for Kibana Discover, dashboards, etc.