Date Not Recognized

I have JSON documents going into ES just fine, but Kibana doesn't seem to recognize this field, even though it records it:

"fields": {
    "": [

I'm sure it's just a configuration thing or a setting, but for the life of me I can't seem to find it...
I spent most of yesterday looking for a good guide on this.

I did find this, but can I apply it retroactively, after I've already added the data to the Elasticsearch index?

Will try this. Not sure if it will work -

es.index(index='some-index', ignore=400, doc_type='docket', id=i, body=json.loads(docket_content))

with this Logstash filter:

	filter {
		mutate {
			convert => [ "", "string" ]
		}
		date {
			match => [ "", "ISO8601" ]
			target => ""
		}
	}

Yes, that should hopefully help. Kibana doesn't have great support for nested fields, so extracting the field should help.

No, I get a Python error when trying this...

I also get this error constantly:

Discover: unknown error

SearchError: unknown error
at http://:5601/bundles/commons.bundle.js:4:383522
at processQueue (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:200650)
at Scope.$digest (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:210412)
at Scope.$apply (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:213219)
at done (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:132717)
at completeRequest (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:136329)
at XMLHttpRequest.requestError (http://:5601/built_assets/dlls/vendors.bundle.dll.js:450:135346)

Discover: unknown error is usually caused by something in between Kibana and ES. Kibana knows how to read Elasticsearch error messages, but outside of that scope it'll respond with "unknown error". You should be able to open the dev tools network tab and see what the response is - sometimes it'll be empty if there's a proxy in front closing connections or so on.

Regarding the fields - apologies. That should work for new data coming in, but existing data will have to be reindexed. Logstash can use Elasticsearch as both the input and the output. Alternatively, the reindex API with a Painless script should be able to help.
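For reference, a minimal sketch of the reindex-API route - the index names (`resource2`, `resource2-fixed`) and field names (`date`, `timestamp`) here are placeholders, not taken from your setup. Run it from Kibana Dev Tools:

```
POST _reindex
{
  "source": { "index": "resource2" },
  "dest":   { "index": "resource2-fixed" },
  "script": {
    "lang": "painless",
    "source": "ctx._source.timestamp = ctx._source.remove('date')"
  }
}
```

The script runs once per copied document, so you can rename or rewrite the problem field on the way into the new index, then point Kibana's index pattern at the new index.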

Hi @Jim_Palazzolo,

Can you try converting this field in your Python script using utc_datetime.strftime('%Y-%m-%dT%H:%M:%S.%f')[:-3]
before indexing into Elastic?

If you need to convert the present data, the only option is re-indexing.
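For example, a small sketch of that conversion - the datetime value here is made up; in practice it would come from your document:

```python
from datetime import datetime

# Hypothetical example value - in the real script this would be parsed
# from the JSON document's date field.
utc_datetime = datetime(2019, 7, 1, 12, 30, 45, 123456)

# %f gives microseconds (6 digits); slicing off the last 3 leaves
# milliseconds, which Elasticsearch's default ISO8601 date parser accepts.
timestamp = utc_datetime.strftime('%Y-%m-%dT%H:%M:%S.%f')[:-3]
print(timestamp)  # 2019-07-01T12:30:45.123
```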



Here's the Python I'm using to move all the JSON files to Elasticsearch.
I'm assuming I could insert that bit of script into this one?

import requests, json, os
from elasticsearch import Elasticsearch

directory = '/usr/share/logstash/misp-json-files'

res = requests.get('http://IP:9200')
print(res.content)
es = Elasticsearch([{'host': 'IP', 'port': '9200'}])

i = 1

for filename in os.listdir(directory):
    if filename.endswith(".json"):
        # Open relative to the directory, not the current working directory
        with open(os.path.join(directory, filename)) as f:
            docket_content = f.read()
        # Send the data into es
        es.index(index='resource2', ignore=400, doc_type='docket', id=i, body=json.loads(docket_content))
        i += 1

I tried converting to string in the index management section.
It didn't help...
Well - the error went away, but only after I blew away the index and started over...

Also - all the ELK stuff sits on the same box; there's no proxy being used at this time.
ELK version 6.8


In your script, send an extra field:

  1. get the field from the JSON object
  2. use the formula above
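Those two steps could be sketched as a small helper dropped into the loop - the field names `date` and `@timestamp`, and the input date format, are assumptions; adjust them to the real MISP JSON layout:

```python
import json
from datetime import datetime

def add_timestamp(doc, source_field='date', target_field='@timestamp'):
    """Read the date string from doc[source_field], reformat it as
    millisecond-precision ISO8601, and store it under target_field.
    Field names and the input format are assumptions."""
    raw = doc.get(source_field)
    if raw:
        parsed = datetime.strptime(raw, '%Y-%m-%d %H:%M:%S')
        doc[target_field] = parsed.strftime('%Y-%m-%dT%H:%M:%S.%f')[:-3]
    return doc

# In the loop, before es.index(...):
doc = add_timestamp(json.loads('{"date": "2019-07-01 12:30:45"}'))
print(doc['@timestamp'])  # 2019-07-01T12:30:45.000
```

The call to es.index would then pass `body=doc` instead of the raw JSON, so the extra field ships with the document.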

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.