Add nested field with Logstash and JDBC

I am using Logstash to update my Elasticsearch index. I have an index called my_files, and I want to update its records with a "metadata" field. "metadata" should be a nested field containing some other fields I want to pull from a MySQL database.
I have the following config file

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://"
    jdbc_user => "username"
    jdbc_password => "password"
    jdbc_driver_library => "./mysql-connector-java-8.0.12.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "select * from mytable"
    use_column_value => true
    clean_run => true
    tracking_column => "id"
    schedule => "*/1 * * * * *"
    jdbc_paging_enabled => true
    jdbc_page_size => 200000
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "my_files"
    action => "update"
    document_id => "%{id}"
    document_type => "file"
  }
  stdout {
    codec => json_lines
  }
}
This config file adds the fields from the query directly to the document. However, I want a nested field "metadata" in the my_files index and to put these fields there. I know I can use mutate to create a field, but I don't know how to do this in an easy way without having to add them one by one.
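One way to avoid renaming columns one by one is a ruby filter that moves every non-reserved field under [metadata]. This is a sketch, not from the thread, and it assumes id should stay at the top level (it is needed for document_id):

```
filter {
  ruby {
    code => '
      meta = {}
      event.to_hash.each do |k, v|
        # skip Logstash-internal fields and the document id
        next if k.start_with?("@") || k == "id"
        meta[k] = v
        event.remove(k)
      end
      event.set("metadata", meta)
    '
  }
}
```

This runs once per event, so it works regardless of how many columns the SELECT returns.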

What will the metadata value look like? Is it dynamic? Does it depend on the value of some field, sort of like a SQL JOIN?

In MySQL I have a table "mytable" with 8 fields; 2 of them should not be inserted here (id and another foreign key), 2 fields are of type json, and 4 fields are of type char.

You are describing the input structure; I asked about the structure you want to add to the document after it is read from the DB.

I would like to have a field called "metadata" and, inside that field, the 6 fields mentioned above.
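In other words, the target document shape would look something like this (the field names here are hypothetical placeholders, since the actual column names were not given):

```
{
  "id": 123,
  "metadata": {
    "json_field_1": { ... },
    "json_field_2": { ... },
    "char_field_1": "...",
    "char_field_2": "...",
    "char_field_3": "...",
    "char_field_4": "..."
  }
}
```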

You can use the jdbc_streaming filter for that, to do what we call enrichment.

The linked doc has quite a good example.
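A minimal sketch of that approach (connection details and column names are placeholders): jdbc_streaming runs the statement once per event, binding the event's id to the :id parameter, and stores the resulting rows as an array of hashes in the target field.

```
filter {
  jdbc_streaming {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "username"
    jdbc_password => "password"
    jdbc_driver_library => "./mysql-connector-java-8.0.12.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    # hypothetical column names; list only the 6 columns you want
    statement => "SELECT col_a, col_b FROM mytable WHERE id = :id"
    parameters => { "id" => "id" }
    target => "metadata"
  }
}
```

Note that because target receives an array of result rows, [metadata] will be a list even when the query matches a single row.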

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.