Need help removing MySQL field name from Logstash Elasticsearch JSON output

Hi,
I want to index the MySQL columns id, project, key_type, data, and created_at as a JSON document in which the data field's contents are flattened into top-level keys, with no "data" field name.
The MySQL values are: id: 1, project: slug1, key_type: slug2, data: {"key1": "value1", "key2": "value2"}, created_at: 2017-05-25 18:06:24

I want the JSON output below to be indexed into Elasticsearch:
{
  "id": 1,
  "project": "slug1",
  "key_type": "slug2",
  "key1": "value1",
  "key2": "value2",
  "created_at": "2017-05-25 18:06:24"
}

My logstash.conf file is:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb?autoReconnect=true&useSSL=false"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/Users/ashishkumar.p/Downloads/mysql-connector-java-5.1.42/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * from testtable"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "192.168.0.100:9200"
    index => "test.migrate"
    document_type => "data"
    document_id => "%{uid}"
  }
}

Does the data column contain the JSON string '{"key1": "value1", "key2": "value2"}', or do you mean something else? To avoid ambiguity, I suggest you post exactly what you're currently getting into Elasticsearch (use copy/paste).

Yes, the data column contains the JSON string. Right now I am getting the following in Elasticsearch:
{
  "id": 1,
  "project": "slug1",
  "key_type": "slug2",
  "data": '{"key1": "value1", "key2": "value2"}',
  "created_at": "2017-05-25 18:06:24"
}

What you posted isn't valid JSON, so it's still not clear exactly what you have, but you can use the json filter to parse JSON strings that are stored in fields.
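
As a sketch, assuming the JSON string really is in a field named data, a filter block along these lines should parse it and then drop the raw string (by default the json filter writes the parsed keys to the root of the event, which matches the desired output):

filter {
  json {
    source => "data"            # field containing the JSON string to parse
    remove_field => ["data"]    # drop the raw string once it has been parsed
  }
}

If you ever want the parsed keys nested under a field instead of at the root, the filter's target option controls that.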

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.