How to write JSON values from a SQL field into Elasticsearch using Logstash

I have a table in SQL Server in which one of the columns holds JSON objects as values. My requirement is to read those objects as-is and write them into an index. I would like to know the best ways of doing this with Logstash and how it can be achieved.

e.g.:

type | response

submitapi | {"abc":"xxxx","xyz":["xxx","xxx"]}
calcapi | {"abc":"xxxx","xyz":["xxx","xxx"]}
checkinapi | {"abc":"xxxx","xyz":["xxx","xxx"]}

I need the values under response to be stored in an index without having the 'type' column as the key.

Thanks

Use the jdbc input to read from the database table. I suppose you want to parse the JSON string in the "response" column so that "abc" and "xyz" end up as separate Elasticsearch fields? Use the json filter for that.
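
For reference, here's a rough, untested sketch of the whole pipeline. The driver path, connection string, credentials, table name (mytable), and index name (myindex) are placeholders you'd adapt to your environment; the type and response columns come from your example:

input {
  jdbc {
    jdbc_driver_library => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT type, response FROM mytable"
  }
}

filter {
  # Parse the JSON string in the response column into structured fields.
  json {
    source => "response"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex"
  }
}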

I can use the jdbc plugin, but it creates "response" as a key with the JSON as its string value; I just need the value as an object.

currently in Elasticsearch:
"response": "{\"abc\":\"xxxx\",\"xyz\":[\"xxx\",\"xxx\"]}"
required in Elasticsearch:
{"abc":"xxxx","xyz":["xxx","xxx"]}

Thanks

Again, use the json filter. Since no target is set, the parsed fields ("abc", "xyz", ...) are written to the top level of the event rather than nested under "response":

filter {
  json {
    source => "response"
  }
}
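
If you also don't want the original string field left on the event after parsing, you can drop it with remove_field, which (like the other common filter options) is only applied when the filter succeeds:

filter {
  json {
    source => "response"
    # Drop the original JSON string once it has been parsed successfully.
    remove_field => ["response"]
  }
}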