I have a streaming process that feeds an ELK index (one new index each day).
The object sent is a JSON document with multiple levels and fields, and I have a complex mapping on this index.
I want to expose a limited part of this object in another index. How can I do that? Can I use an ingest pipeline to copy part of the object to another index each time a document is written? Is there something like a "SQL view"?
The goal is to expose a limited object to some Kibana users.
I know you can set permissions per field, but I don't want to manage permissions field by field (there are too many of them).
You can create filtered aliases to select which documents an alias exposes. That is roughly the equivalent of a WHERE clause in SQL.
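For example, a filtered alias can be added with the aliases API. The index name, alias name, and the `visibility` field below are made up for illustration:

```
POST _aliases
{
  "actions": [
    {
      "add": {
        "index": "logs-2023.01.01",
        "alias": "logs-public",
        "filter": { "term": { "visibility": "public" } }
      }
    }
  ]
}
```

Users who query `logs-public` only see documents matching the filter, but they still see every field of those documents.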
But to filter the fields, I'm afraid you can only do it by reindexing the data, which lets you select both the documents and the fields you want to copy to the destination index. The other option is to index your data twice at ingest time.
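A reindex can combine both: a `query` to pick documents and a `_source` list to pick fields. A sketch, with hypothetical index and field names:

```
POST _reindex
{
  "source": {
    "index": "logs-2023.01.01",
    "_source": ["user.name", "event.type"],
    "query": { "term": { "visibility": "public" } }
  },
  "dest": {
    "index": "logs-limited"
  }
}
```

Note that with daily indices you would need to run this per index (or on a wildcard pattern), so it is a batch copy rather than a live view.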
It is easier to do this during the indexing process: you send the full document to one index and the limited document to the other.
If you are using Logstash, this can be done easily.
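A minimal Logstash sketch of that idea, assuming the `clone` filter (which sets `type` on the copies) and made-up field and index names; the `prune` whitelist only works on top-level fields, so nested internals are dropped with `mutate`:

```
filter {
  clone {
    clones => ["limited"]
  }
  if [type] == "limited" {
    # keep only the top-level fields the restricted index should expose
    prune {
      whitelist_names => ["^user$", "^event$", "^@timestamp$", "^type$"]
    }
    # drop sensitive nested fields (hypothetical field name)
    mutate {
      remove_field => ["[event][internal_trace]"]
    }
  }
}

output {
  if [type] == "limited" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "logs-limited-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "logs-full-%{+YYYY.MM.dd}"
    }
  }
}
```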
Otherwise, you would need to add this logic to your Java code.
Basically you end up with two independent indices, one with the full document and one with the limited document, so you need to decide whether that duplication is worth it.
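The "limited document" logic in your own code can be as simple as projecting a whitelist of dotted field paths out of the full document before sending it to the second index. A Python sketch (the field names are hypothetical):

```python
def project(doc, paths):
    """Copy only the given dotted-path fields from a nested dict."""
    out = {}
    for path in paths:
        keys = path.split(".")
        # walk down to the parent of the leaf; skip paths missing here
        src = doc
        found = True
        for key in keys[:-1]:
            if isinstance(src, dict) and key in src:
                src = src[key]
            else:
                found = False
                break
        if found and isinstance(src, dict) and keys[-1] in src:
            # rebuild the same nesting in the output document
            dst = out
            for key in keys[:-1]:
                dst = dst.setdefault(key, {})
            dst[keys[-1]] = src[keys[-1]]
    return out

full = {
    "user": {"name": "jane", "email": "jane@example.com"},
    "event": {"type": "login", "internal_trace": "abc123"},
}
limited = project(full, ["user.name", "event.type"])
# limited == {"user": {"name": "jane"}, "event": {"type": "login"}}
```

You would then index `full` into the main index and `limited` into the restricted one, in the same write path.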