Indexing structured documents using Logstash JDBC plugin

Hi,

I'm new to Logstash and I have a simple scenario that I'm trying to execute.

Using the Logstash JDBC plugin I'm fetching data from MySQL and Indexing it in ES.

Before using Logstash I used the Elasticsearch JDBC plugin (jprante), which allowed me to index structured data using "." and even JSON arrays using brackets ("[]").

I'm trying to do the same with the Logstash JDBC plugin, but it's not working (it doesn't support dots).

I wanted to know how I can achieve the same things I did with the ES JDBC plugin using the Logstash plugin (i.e. structured JSON objects with arrays).

Thanks and best regards,
Orel

Use [field][subfield] to denote nested fields. See https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references. If this doesn't help, please supply more details.
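For example (a minimal sketch, assuming the jdbc input produces a flat column named FIELD_1), a mutate filter can move a flat field under a nested one using that reference syntax:

```
filter {
  mutate {
    # Move the flat FIELD_1 column into a nested "group" object,
    # producing { "group": { "FIELD_1": ... } } in the event.
    rename => { "FIELD_1" => "[group][FIELD_1]" }
  }
}
```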

Hi,

Thanks for the quick response.

I'm not sure how I can use this to solve my issue.

Let's say I have a query (over MySQL) that I'm using for my input, which returns the following:

ID  FIELD_1  FIELD_2
1   1        1
1   1        2
2   1        1

And I'd like to format them into the following JSON:
{
  doc: [
    {
      id: 1,
      group: [
        { FIELD_1: 1, FIELD_2: 1 },
        { FIELD_1: 1, FIELD_2: 2 }
      ]
    },
    {
      id: 2,
      group: [
        { FIELD_1: 1, FIELD_2: 1 }
      ]
    }
  ]
}

I can achieve that with the elasticsearch-jdbc plugin by aliasing the FIELD_1 column as group[FIELD_1], i.e.
select .... FIELD_1 as group[FIELD_1] , FIELD_2 as group[FIELD_2].

But I'm not sure how I can use the reference you sent to get the same result. Can you please provide an example?

I hope that's clear enough :slight_smile:

Thanks and best regards,
Orel

I don't think there's an easy way of accomplishing this with Logstash. I'd go for a custom script or at least use a custom script to help transform the data—you could still use Logstash for polling the database and shipping to Elasticsearch (or whatever outputs you use).
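To illustrate the custom-script route, here is a minimal sketch in Python (field names and sample rows taken from the example above; the database polling and the Elasticsearch shipping steps are omitted) that groups the flat query rows into the nested shape:

```python
from itertools import groupby

def build_docs(rows):
    """Group flat (id, FIELD_1, FIELD_2) rows into nested docs by id."""
    docs = []
    # groupby only merges adjacent rows, so sort by the grouping key first.
    for doc_id, group in groupby(sorted(rows), key=lambda r: r[0]):
        docs.append({
            "id": doc_id,
            "group": [{"FIELD_1": f1, "FIELD_2": f2} for _, f1, f2 in group],
        })
    return docs

# Sample rows as returned by the MySQL query above.
rows = [
    (1, 1, 1),
    (1, 1, 2),
    (2, 1, 1),
]

docs = build_docs(rows)
# Each doc can then be indexed into Elasticsearch, e.g. with the official
# Python client's bulk helper (elasticsearch.helpers.bulk).
```

This keeps the transformation in a few lines while, as suggested, Logstash (or the script itself) can still handle polling and shipping.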