Logstash http output to Kafka-REST-Proxy

Hi everyone,
I know there's a Kafka output plugin for Logstash, but I'd like to use Confluent's REST proxy to push events to Kafka instead. According to the documentation http://docs.confluent.io/3.0.0/kafka-rest/docs/intro.html#produce-and-consume-json-messages I'd need to wrap the JSON messages like this:

{
   "records": [
     { "value": { <-- my json document here --> } },
     ...
   ]
}

Is there any filter that would allow me to do this, or do I need to write another plugin?
The "mapping" option of the http output plugin seems like it could be useful here, but I couldn't find many examples of how to use it to take the whole document and put it inside records[].value.
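For reference, this is the kind of output config I have in mind — a sketch only, assuming the proxy runs on localhost:8082 and a placeholder topic name "logstash_topic"; I don't know whether the mapping option can actually express the nested records array:

   output {
     http {
       url => "http://localhost:8082/topics/logstash_topic"
       http_method => "post"
       content_type => "application/vnd.kafka.json.v2+json"
       format => "json"
       # ideally this would produce { "records": [ { "value": <the whole event> } ] },
       # but I haven't found a way to express that nesting with mapping
       mapping => { "records" => "%{message}" }
     }
   }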

Thank you for your time.

Do you need to embed the JSON document (as a string), or the data structure itself, into a nested object and then serialize the whole object?

That is, do you actually need, as you say:

{
   "records": [
     { "value": "<-- my json document here, as a string -->" },
     ...
   ]
}

OR

{
   "records": [
     { "value": { <-- nested object here --> } },
     ...
   ]
}

And then convert the whole thing to JSON? My concern with the first option is that if the value already contains a serialized JSON document, it will be escaped again when the wrapper is serialized. I assume you simply want to nest an object into a { "records": [ { "value": object } ] } structure?
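To illustrate the difference (the field name "user" is just an example): if value already holds a JSON string, serializing the wrapper again produces double-escaped quotes:

   { "records": [ { "value": "{\"user\":\"foo\"}" } ] }

whereas nesting the object itself and serializing once gives the shape the REST proxy expects:

   { "records": [ { "value": { "user": "foo" } } ] }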