Hello everyone!
I have a streaming process that feeds an Elasticsearch index (one new index each day).
The object sent is a JSON document with multiple levels and fields, and I have a complex mapping on this index.
I want to expose a limited part of this object to another index. How can I do that? Can I use an ingest pipeline to copy part of the object to another index each time a document is written? Is there something like a "SQL view"?
The goal is to expose a limited object to some Kibana users.
I know that you can set permissions for each field, but I don't want to manage permissions field by field (there are too many of them).
Thanks for your guidance.
You can create filtered aliases to select which documents an alias exposes. That would be like a WHERE clause in SQL.
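For example, a filtered alias can be added with the aliases API; a sketch, assuming a daily index pattern `logs-*`, an alias name `logs-public`, and a `visibility` field (all hypothetical names here):

```json
POST _aliases
{
  "actions": [
    {
      "add": {
        "index": "logs-*",
        "alias": "logs-public",
        "filter": { "term": { "visibility": "public" } }
      }
    }
  ]
}
```

Queries against `logs-public` then only see documents matching the filter, but note this restricts *documents*, not *fields*.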
But to filter the fields, I'm afraid you can only do this by reindexing the data, which lets you select both the documents and the fields you want to copy to the destination index. The other option would be to index your data twice at ingest time.
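The reindex approach can filter fields with source filtering; a sketch, assuming a source index `logs-2024.05.01`, a destination `logs-limited`, and example field names (all hypothetical):

```json
POST _reindex
{
  "source": {
    "index": "logs-2024.05.01",
    "_source": ["@timestamp", "message", "user.name"],
    "query": { "term": { "level": "INFO" } }
  },
  "dest": { "index": "logs-limited" }
}
```

The `_source` list keeps only the named fields in the copied documents, and the optional `query` restricts which documents are copied. The downside is that this is a batch operation, so you would need to run it periodically (or per daily index) rather than getting a live view.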
To be honest, I think the best option is to use the field-level security system.
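Field-level security doesn't have to be managed field by field: the `grant` list accepts wildcard patterns, which may address the "too many fields" concern. A sketch of a role definition, assuming hypothetical index and field names (requires a license tier that includes field-level security):

```json
PUT _security/role/limited_reader
{
  "indices": [
    {
      "names": ["logs-*"],
      "privileges": ["read"],
      "field_security": {
        "grant": ["@timestamp", "message", "user.*"]
      }
    }
  ]
}
```

Users assigned this role would see only the granted fields in search results and in Kibana, with no data duplication.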
Thanks a lot for your answer, I was hoping that I had missed an option.
It seems not, so we will manage with one of these options. Thanks again.
How are you indexing it?
It is easier to do this during the indexing process: you would send the full document to one index and the limited document to the other index.
If you are using Logstash this can be done easily.
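As a rough sketch of the Logstash approach: the `clone` filter duplicates each event and the `prune` filter whitelists fields on the copy, so the full and limited documents go to different indices. Index names, the `limited` marker, and the field list are all assumptions here; exact `clone` behavior (whether the copy is marked via `type` or `tags`) varies with the ECS compatibility setting, so check against your version:

```
filter {
  clone { clones => ["limited"] }
  if [type] == "limited" {
    prune { whitelist_names => ["^@timestamp$", "^message$", "^user$"] }
  }
}

output {
  if [type] == "limited" {
    elasticsearch { index => "logs-limited-%{+YYYY.MM.dd}" }
  } else {
    elasticsearch { index => "logs-full-%{+YYYY.MM.dd}" }
  }
}
```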
It's Java code running on a Kubernetes cluster, so it's using the Java API client.
You would need to add this logic to your java code then.
Basically you would create two independent indices, one with your full document and the other with the limited document, so you need to decide whether that duplication is worth it.
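On the Java side, the main piece of logic you would add is deriving the limited document from the full one before the second index request. A minimal, self-contained sketch of such a projection helper (the class name, dot-notation field list, and map-based document representation are all assumptions; the actual index calls through the Java API client are omitted):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public final class DocumentProjector {

    // Build a limited copy of a nested map, keeping only the whitelisted
    // fields. Fields use dot notation, e.g. "user.name".
    public static Map<String, Object> project(Map<String, Object> full,
                                              List<String> fields) {
        Map<String, Object> limited = new LinkedHashMap<>();
        for (String path : fields) {
            copyPath(full, limited, path.split("\\."), 0);
        }
        return limited;
    }

    @SuppressWarnings("unchecked")
    private static void copyPath(Map<String, Object> src,
                                 Map<String, Object> dst,
                                 String[] parts, int i) {
        Object value = src.get(parts[i]);
        if (value == null) {
            return; // field absent in this document, skip silently
        }
        if (i == parts.length - 1) {
            dst.put(parts[i], value); // leaf: copy the value as-is
        } else if (value instanceof Map) {
            // intermediate level: descend, creating the nested map on demand
            Map<String, Object> child = (Map<String, Object>)
                dst.computeIfAbsent(parts[i], k -> new LinkedHashMap<String, Object>());
            copyPath((Map<String, Object>) value, child, parts, i + 1);
        }
    }
}
```

You would then index the original map into the full index and the projected map into the limited index, as two separate requests.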
Thanks for your answer. I was looking for a solution that would avoid duplicating the data, but it doesn't seem possible.