I'm facing the same issue. I removed all indexes and verified that all files and traces of them are gone.
Here is my mapping:

    {
      "my-stuff": {
        "mappings": {
          "filedetails": {
            "properties": {
              "atime": {"type": "double"},
              "ctime": {"type": "double"},
              "gid":   {"type": "long"},
              "mode":  {"type": "long"},
              "mtime": {"type": "double"},
              "name":  {"type": "string"},
              "size":  {"type": "long"},
              "uid":   {"type": "long"}
            }
          }
        }
      }
    }
Is some other index that is part of Logstash affecting this?
I have been unable to force mappings on this; yes, I'm a noob.
Here is the Python code generating the JSON:
    def insert_record(js):
        global es
        # serialize the record and index it under the 'filedetails' type
        doc = json.dumps(js, ensure_ascii=True)
        return es.index(index=INDEX, doc_type='filedetails', body=doc)
    def store_file(f):
        st = os.stat(f)  # stat once instead of once per field
        # nlink = st.st_nlink
        js = {
            'name':  f,
            'mode':  st.st_mode,
            'uid':   st.st_uid,
            'gid':   st.st_gid,
            'size':  st.st_size,
            'atime': st.st_atime,
            'mtime': st.st_mtime,
            'ctime': st.st_ctime
        }
        insert_record(js)
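For reference, this is roughly how I understand the mapping should be applied: create the index with an explicit mapping before indexing any documents, so dynamic mapping never kicks in. This is a sketch, not something I've confirmed works; the client setup and the `filedetails_mapping` helper name are my own, and I'm assuming the pre-5.x elasticsearch-py client (hence the `string` type and `doc_type` usage above).

```python
# Sketch: build the index-creation request body with an explicit mapping
# that mirrors the mapping shown earlier (times as double, ids/sizes as long).
# Assumption: pre-5.x Elasticsearch, index name "my-stuff" as in my code.

def filedetails_mapping():
    """Request body for index creation: the 'filedetails' type mapping."""
    return {
        "mappings": {
            "filedetails": {
                "properties": {
                    "name":  {"type": "string"},
                    "mode":  {"type": "long"},
                    "uid":   {"type": "long"},
                    "gid":   {"type": "long"},
                    "size":  {"type": "long"},
                    "atime": {"type": "double"},
                    "mtime": {"type": "double"},
                    "ctime": {"type": "double"},
                }
            }
        }
    }

# Against a live cluster I would expect to apply it like this (untested):
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch()
#   es.indices.create(index="my-stuff", body=filedetails_mapping())
```

The point is ordering: if the index is created with this body first, the documents from `store_file` should index against the explicit mapping rather than whatever dynamic mapping infers.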
Any feedback on how to get this working, or a working index-creation JSON, would be welcome. I'm not sure why something this simple is causing the issue; long isn't exactly an exotic type these days.