How to consume a JSON stream and keep the original stream data unchanged

I am implementing a Kibana plugin to check/authorize the request payload. Some request payloads are in a streamed JSON (NDJSON) format; an example is below.


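An illustrative payload (the original example was not preserved; this is a sketch of an `_msearch`-style NDJSON body using the index names mentioned below, with placeholder queries):

```json
{"index":"logstash1*"}
{"query":{"match_all":{}}}
{"index":"logstash2*"}
{"query":{"match_all":{}}}
```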
I need to first extract all the index names (logstash1* and logstash2* in the example above), then run some checks/authorization on them. The code is roughly as follows:

import JSONStream from 'JSONStream';
import eventStream from 'event-stream';

async function checkIndexNames(request) {
    const payload = request.payload;
    const indexArray = [];
    await new Promise((resolve, reject) => {
        eventStream.pipeline(
            payload,
            JSONStream.parse('index'),                          // emit the "index" field of each JSON object
            eventStream.mapSync((index) => indexArray.push(index))
        )
        .on('end', resolve)
        .on('error', reject);
    });

    // Do some checks on the index names

Basically, the above code works well: all the index names are extracted correctly. After my plugin finishes checking the payload, Kibana/Elasticsearch needs to continue processing it.

The issue is that Kibana/Elasticsearch then fails to process the payload, complaining "[parse_exception] request body or source parameter is required". The reason is that the payload stream has already been consumed by my plugin, so the original request payload no longer exists. If I comment out the `await eventStream.pipeline(...)` call, everything works fine in Kibana.

So the question is: how can I consume the request payload while keeping the original payload available afterwards? Thanks.

Hey there, unfortunately we don't currently provide support for third-party plugin development, because our APIs are still evolving and are liable to break plugins as they change.



Thanks for the feedback. Actually, this is a generic Node.js question rather than something specific to Kibana/Elasticsearch. Any suggestions would be appreciated.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.