I am implementing a Kibana plugin to check/authorize the request payload. Some request payloads are newline-delimited JSON (NDJSON), for example:
{"index":["logstash1*"],"ignore_unavailable":true,"preference":1523345500966}
{"size":0,"_source":{"excludes":[]}}
{"index":["logstash2*"],"ignore_unavailable":true,"preference":1523345500966}
{"size":0,"_source":{"excludes":[]}}
I need to extract all the index names (logstash1* and logstash2* in the above example) first, then do some checks/authorization on those names. The code is something like this:
import JSONStream from 'JSONStream';
import eventStream from 'event-stream';

async function checkIndexNames(request) {
  const payload = request.payload;
  const indexArray = [];
  // Wrap the pipeline in a Promise so we actually wait for the stream to
  // finish: eventStream.pipeline returns a stream, not a Promise
  await new Promise((resolve, reject) => {
    eventStream.pipeline(
      payload,
      JSONStream.parse(['index']),  // emit the value under each "index" key
      eventStream.mapSync((data) => {
        indexArray.push(data);
      })
    )
      .on('end', resolve)
      .on('error', reject);
  });
  // Do some checks on the index names
}
Basically the above code works well; all the index names are extracted correctly. After my plugin finishes checking the payload, Kibana/Elasticsearch needs to continue processing it.
The issue is that Kibana/Elasticsearch then fails to process the payload and complains "[parse_exception] request body or source parameter is required". The reason is that the payload stream has already been consumed by my plugin, so the original request payload no longer exists. If I comment out the eventStream.pipeline(...) step, everything is OK with Kibana.
So the question is: how can I consume the request payload while keeping the original payload intact for the downstream handlers? Thanks.