Every day I create an index of my file system with Diskover, so I have a large number of indexes, one for every day. Each of those indexes has a document for each file, but there is no field for the time the index/document was created. Is there a way to map the index's creation_date to a new timestamp field in all the existing documents? It looks like for future indexes I can create a default pipeline to do this automatically, but right now I have around 200 existing indexes with a total of ~800 million documents that would need to be updated.
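One possible approach (a sketch, not something tested against your cluster): read each index's `creation_date` from its settings with `GET _settings` and backfill it into every document with `_update_by_query`. The field name `index_creation_date` and both helper functions below are assumptions for illustration, not part of Diskover.

```python
# Sketch: backfill each existing index with its own creation timestamp.
# Assumes the elasticsearch-py client; "index_creation_date" is a made-up
# field name -- rename it to whatever you want to graph in Grafana.

def creation_date_millis(settings_response, index):
    """Extract index.creation_date (epoch millis) from a GET _settings response."""
    return int(settings_response[index]["settings"]["index"]["creation_date"])

def backfill_body(ts_millis):
    """Painless script body for _update_by_query that stamps every document."""
    return {
        "script": {
            "lang": "painless",
            "source": "ctx._source.index_creation_date = params.ts",
            "params": {"ts": ts_millis},
        }
    }

# Usage against a live cluster (untested sketch):
#
# from elasticsearch import Elasticsearch
# es = Elasticsearch("http://localhost:9200")
# settings = es.indices.get_settings(index="diskover-*")
# for index in settings:
#     body = backfill_body(creation_date_millis(settings, index))
#     es.update_by_query(index=index, body=body, conflicts="proceed",
#                        wait_for_completion=False)
```

With ~800 million documents you would want `wait_for_completion=False` so each update runs as a background task you can poll through the tasks API, rather than blocking the request.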
I don't think you can do that easily. But I'm curious about the use case; I suspect it's one you could solve in another way.
Each index has a document for a folder with the size of that folder on the day it was indexed. I want a query that gets that document from each index and graphs the size in Grafana, which would show the size of the folder over time. The problem is that none of the dates already in the document work: they are things like the folder's creation/modification time, which doesn't change often even when files in that folder are added or removed. So I need a way to go back and add a new field holding the time the index was created.
And when are you running your index job? Every day?
It reminds me a bit of this blog post: Building a directory map with ELK - David Pilato
Where basically you can run an `ls -lr` every day, in which case you can add the date of the run to the document you are indexing.
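The idea above can be sketched in a few lines: walk the tree yourself and stamp each document with the run date before indexing, so one run shares one timestamp. Everything here (the `walk_docs` helper, the field names) is illustrative, not Diskover's actual document format.

```python
import os
from datetime import datetime, timezone

def walk_docs(root, run_date):
    """Yield one document per file under root, stamped with the run date."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            yield {
                "path": path,
                "size": os.path.getsize(path),
                "run_date": run_date,  # the field you can graph over time
            }

# One run = one timestamp shared by every document of that run:
# run_date = datetime.now(timezone.utc).isoformat()
# for doc in walk_docs("/data", run_date):
#     index it (e.g. with the elasticsearch-py bulk helper)
```

Because `run_date` is set on your side at indexing time, you control it as a business field instead of depending on any Elasticsearch internals.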
I mean that you need that field for a business reason, so you should control that field on your side and not rely on an internal technical Elasticsearch field.
Could you share a bit more, like:
- A typical document
- The index names