I'm developing Kibana (7.11.2) dashboards based on JSON Lines data that is forwarded by a (proprietary) application that I'm also involved in developing.
I'm currently using the application to forward data with "compact" field (JSON key) names, such as indecpup. However, the application can also forward the same data values with longer, less-cryptic field names, such as independent_enclave_cpu_percent.
There are pros and cons to these different field names, but, especially for demonstration purposes, I will likely transition to using the longer field names.
The visualizations that I've created so far refer to the compact field names.
I'm considering how to automate migrating my existing Kibana saved objects (including several dozen visualizations/searches) to refer to the longer names. I'd welcome advice on how to do this.
Suppose that I have a table that maps between the compact and longer names.
It occurs to me that I could write a script (say, in Python) that searches an exported .ndjson file for \"field\":\"compact-name\" and \"fieldName\":\"compact-name\", and replaces compact-name with the corresponding longer-name.
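For example, here's a minimal sketch of what I have in mind (the mapping dict, file names, and function name are placeholders for my real compact-to-longer table and export file):

```python
#!/usr/bin/env python3
"""Rewrite compact field names to longer field names in an exported .ndjson file."""

# Placeholder mapping -- in reality this would come from my compact/longer name table.
FIELD_NAME_MAP = {
    "indecpup": "independent_enclave_cpu_percent",
    # ... remaining mappings ...
}

def rewrite_line(line: str) -> str:
    """Replace "field":"compact-name" and "fieldName":"compact-name" occurrences."""
    for compact, longer in FIELD_NAME_MAP.items():
        for key in ("field", "fieldName"):
            # Saved objects embed JSON as escaped strings, so match the \" form.
            line = line.replace(f'\\"{key}\\":\\"{compact}\\"',
                                f'\\"{key}\\":\\"{longer}\\"')
    return line

with open("export.ndjson", encoding="utf-8") as src, \
     open("export_renamed.ndjson", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(rewrite_line(line))
```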
I could export all saved objects, run my script on the exported .ndjson file, and then import the edited version.
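To avoid doing the export/import by hand each time, I could presumably also drive that step through the Saved Objects API. Something like the following is what I imagine (the Kibana URL and the list of object types are placeholders, and I'd need to double-check the API details against 7.11.2):

```python
import requests

KIBANA = "http://localhost:5601"   # placeholder Kibana URL
HEADERS = {"kbn-xsrf": "true"}     # required by the Kibana API

# Export the saved object types I care about.
resp = requests.post(
    f"{KIBANA}/api/saved_objects/_export",
    headers={**HEADERS, "Content-Type": "application/json"},
    json={"type": ["dashboard", "visualization", "search", "index-pattern"]},
)
resp.raise_for_status()
with open("export.ndjson", "w", encoding="utf-8") as f:
    f.write(resp.text)

# ... run the rename script on export.ndjson -> export_renamed.ndjson ...

# Import the edited file, overwriting the existing saved objects.
with open("export_renamed.ndjson", "rb") as f:
    resp = requests.post(
        f"{KIBANA}/api/saved_objects/_import",
        params={"overwrite": "true"},
        headers=HEADERS,
        files={"file": ("export_renamed.ndjson", f, "application/ndjson")},
    )
resp.raise_for_status()
print(resp.json())
```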
I'd delete all existing Elasticsearch indexes containing data with the compact field names, and then ingest new data with the longer field names. (Right now, in development, I don't need to keep any old data with the compact field names.)
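For that cleanup step, I imagine something as simple as a wildcard delete against the old indexes (the index name pattern here is a placeholder for my actual indexes):

```python
import requests

ES = "http://localhost:9200"    # placeholder Elasticsearch URL
OLD_INDICES = "my-old-data-*"   # placeholder pattern for the compact-field-name indexes

# Delete the development indexes that hold documents with the compact field names.
resp = requests.delete(f"{ES}/{OLD_INDICES}")
resp.raise_for_status()
print(resp.json())              # expect {"acknowledged": true}
```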
I'd need to refresh the corresponding index patterns, to "flush" the old field names and load the new field names. Er... how do I do that in Kibana 7.11.2, or later? I don't see a "Refresh" button when I select an index pattern under Stack Management > Index patterns.
Is that it? Have I missed anything?
Is there an existing solution for doing this?