Hey all,
I am a bit new to Logstash.
I have a SQL table that looks like this.
As we can see, the columns 'Address' and 'Skills' in this table contain JSON.
I want to load all the data while maintaining the JSON structure of the columns mentioned above.
I have a Logstash config file that looks like this.
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://db_url"
    # The user we wish to execute our statement as
    jdbc_user => "dummy_user"
    jdbc_password => "pass"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/usr/share/java/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM db_name.table_name"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "localhost:9200"
    index => "idx-nppes"
  }
  stdout { codec => rubydebug }
}
When I load my data using this configuration, it looks as follows.
As we can see, the JSON data is loaded as a string and not as JSON.
I tried using the JSON filter in the Logstash file, but with no success.
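For reference, the filter block I tried was roughly along these lines (the lowercase field names address and skills are an assumption, since the jdbc input lowercases column names by default; they may need adjusting to match the actual event fields):

filter {
  # Parse the JSON string stored in each column back into structured fields.
  # 'source' is the field holding the JSON string; 'target' is where the
  # parsed object should be written (here, back onto the same field).
  json {
    source => "address"
    target => "address"
  }
  json {
    source => "skills"
    target => "skills"
  }
}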
What changes do I need to make to my Logstash file so that the JSON data is loaded as JSON rather than as a string, while still loading the rest of the normal data in the table?