I have an index pattern with a huge number of fields and I want to clean it up by removing the unused ones. The way I'm checking whether a field is used is by searching for it in the available fields list in Kibana; if the field doesn't show up there, I assume it's not used. For example, in the picture I searched for the 'card' field and got no results, so I'm assuming that field is unused across the 4.9 billion log entries and can therefore be deleted. Is my approach correct, or are there cases where this doesn't work? Thanks.
This is not a logstash question, so this is the wrong forum.
That said, you could use the mapping API to get the mapping, parse that result to extract all the field names, then run an exists query for each one and see if there are any documents in the result set.
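A minimal sketch of that approach in Python, using only the standard library. The endpoint `http://localhost:9200` and the index name `my-index` are placeholder assumptions; adjust them for your cluster:

```python
import json
import urllib.request

ES = "http://localhost:9200"  # assumed Elasticsearch endpoint
INDEX = "my-index"            # hypothetical index name


def extract_fields(properties, prefix=""):
    """Recursively flatten a mapping's 'properties' into dotted field names."""
    names = []
    for name, spec in properties.items():
        full = f"{prefix}{name}"
        if "properties" in spec:  # object / nested field: recurse
            names.extend(extract_fields(spec["properties"], full + "."))
        else:
            names.append(full)
            # include multi-fields such as '.keyword' sub-fields
            for sub in spec.get("fields", {}):
                names.append(f"{full}.{sub}")
    return names


def count_exists(field):
    """Return how many documents have a value for the given field."""
    body = json.dumps({"query": {"exists": {"field": field}}}).encode()
    req = urllib.request.Request(
        f"{ES}/{INDEX}/_count", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["count"]


def find_unused_fields():
    """Fetch the mapping, then report every field with zero matching docs."""
    with urllib.request.urlopen(f"{ES}/{INDEX}/_mapping") as resp:
        mapping = json.load(resp)
    props = next(iter(mapping.values()))["mappings"]["properties"]
    return [f for f in extract_fields(props) if count_exists(f) == 0]
```

Calling `find_unused_fields()` returns the candidate fields for removal. With ~10k fields this issues ~10k `_count` requests, so expect it to take a while, but it is entirely unattended.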
Thanks, however I have 10k fields in the index pattern, so I'm looking for a more practical approach.
Not sure why you do not think it is practical. It requires some basic scripting, which you could actually do in logstash itself:

- a generator input to flush an event into the pipeline
- an http filter to fetch the mapping from elasticsearch
- a json filter to parse the mapping
- a ruby filter to iterate over the mapping and shove all the field names into an array
- a split filter to blow that out into one event per field
- another http filter to run the exists query against elasticsearch for each field
- another json filter to parse the result
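The pipeline above could be sketched roughly like this (untested; the elasticsearch URL and the index name `my-index` are placeholders, and the ruby walk only handles nested `properties`, not multi-fields):

```
input {
  generator { count => 1 message => "start" }
}
filter {
  # fetch the mapping for the index
  http {
    url => "http://localhost:9200/my-index/_mapping"
    target_body => "mapping_json"
  }
  json { source => "mapping_json" target => "mapping" }
  # walk the mapping and collect all field names into an array
  ruby {
    code => '
      names = []
      walk = lambda do |props, prefix|
        props.each do |name, spec|
          if spec["properties"]
            walk.call(spec["properties"], "#{prefix}#{name}.")
          else
            names << "#{prefix}#{name}"
          end
        end
      end
      index = event.get("mapping").values.first
      walk.call(index["mappings"]["properties"], "")
      event.set("field_names", names)
    '
  }
  # one event per field name
  split { field => "field_names" }
  # run an exists query against elasticsearch for each field
  http {
    url => "http://localhost:9200/my-index/_count"
    verb => "POST"
    body_format => "json"
    body => { "query" => { "exists" => { "field" => "%{field_names}" } } }
    target_body => "count_json"
  }
  json { source => "count_json" target => "count_result" }
}
output { stdout { codec => rubydebug } }
```

Any event where `[count_result][count]` is zero points at a field with no documents.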
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.