Hi,
I am trying to get a Logstash filter that writes a JSON object to Kafka as an array of JSON objects.
Right now, without any filter, it writes the plain JSON object, but I need the Kafka events to be arrays of JSON objects (so that the consumer understands them and can read them).
This is what is happening now:
{"http.method":"GET","http.path":"/serviceB"},"timestamp":1557911843380103,"parentId":"32f100cf25cd47ad","id":"1b61c2205ff61f8f","name":"get","localEndpoint":{"ipv4":"127.0.0.1","serviceName":"sample"},"traceId":"32f100cf25cd47ad"}
But I need it to be within an array:
[{"http.method":"GET","http.path":"/serviceB"},"timestamp":1557911843380103,"parentId":"32f100cf25cd47ad","id":"1b61c2205ff61f8f","name":"get","localEndpoint":{"ipv4":"127.0.0.0","serviceName":"sample"},"traceId":"32f100cf25cd47ad"}]
We have an application which sends the data over HTTP (the body is a JSON object) like below.
I have written a Logstash config to listen for this HTTP request and accept it; one filter removes the fields added by the http input plugin, and then the JSON object is output to a Kafka topic. Consumers read it from the Kafka topic. The problem is that the consumers can only read an array of JSON objects, not the bare JSON object as-is. With the current Logstash config, the messages/events I see in the Kafka topic are the same as what the application sends (this is trace data generated by Zipkin instrumentation in that application).
I added output { stdout { codec => rubydebug } }; the output looks like below. I need to add square brackets around it, i.e. put the JSON object into an array of JSON objects.
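For reference, the pipeline described above would look roughly like this (the port, the removed field names, and the topic name are assumptions here, not your actual config):

```
input {
  http {
    port => 8080
  }
}
filter {
  # drop the fields the http input plugin adds
  mutate {
    remove_field => [ "headers", "host", "@version", "@timestamp" ]
  }
}
output {
  kafka {
    topic_id => "zipkin"
    codec => json
  }
}
```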
OK. You could do it using the json_encode filter (which you will need to install). First copy all the interesting fields to another field, which we can put under [@metadata].
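A sketch of that approach, assuming the Zipkin span fields shown in your example (the field list, the payload field name, and the topic name are illustrative, adjust them to your data):

```
filter {
  # copy the interesting span fields into a single hash under @metadata
  ruby {
    code => '
      span = {}
      [ "traceId", "parentId", "id", "name", "timestamp", "localEndpoint" ].each do |k|
        span[k] = event.get(k) unless event.get(k).nil?
      end
      event.set("[@metadata][span]", span)
    '
  }
  # serialize that hash to a JSON string
  # (requires: bin/logstash-plugin install logstash-filter-json_encode)
  json_encode {
    source => "[@metadata][span]"
    target => "payload"
  }
  # wrap the JSON string in square brackets to make it an array
  mutate {
    gsub => [ "payload", "^", "[", "payload", "$", "]" ]
  }
}
output {
  kafka {
    topic_id => "zipkin"
    codec => plain { format => "%{payload}" }
  }
}
```

The plain codec with a format string ensures Kafka receives exactly the bracketed string rather than a re-encoded event.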