I'm new to Logstash and I have a simple scenario that I'm trying to execute.
Using the Logstash JDBC plugin I'm fetching data from MySQL and indexing it in ES.
Before Logstash I used the elasticsearch-jdbc plugin (jprante), which allowed me to index structured data using dots (".") and even JSON arrays using brackets ("[]").
I'm trying to do the same with the Logstash JDBC plugin, but it's not working (it doesn't support dots).
I'd like to know how I can achieve with the Logstash plugin the same things I did with the ES JDBC plugin (i.e. structured JSON objects with arrays).
I'm not sure how I can use this to solve my issue.
Let's say I have a query (over MySQL) that I'm using as my input and that returns the following:
ID  FIELD_1  FIELD_2
1   1        1
1   1        2
2   1        1
And I'd like to format it into the following JSON:
{
  "doc": [
    {
      "id": 1,
      "group": [
        {
          "FIELD_1": 1,
          "FIELD_2": 1
        },
        {
          "FIELD_1": 1,
          "FIELD_2": 2
        }
      ]
    },
    {
      "id": 2,
      "group": [
        {
          "FIELD_1": 1,
          "FIELD_2": 1
        }
      ]
    }
  ]
}
I can achieve that with the elasticsearch-jdbc plugin by aliasing the FIELD_1 column as group[FIELD_1], i.e.:
select .... FIELD_1 as group[FIELD_1], FIELD_2 as group[FIELD_2].
But I'm not sure how I can use the reference you sent to get the same result. Can you please provide an example?
I don't think there's an easy way of accomplishing this with Logstash. I'd go for a custom script, or at least use one to help transform the data; you could still use Logstash for polling the database and shipping to Elasticsearch (or whatever outputs you use).
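If it helps, here's a rough sketch of what such a custom script could look like in Python, assuming the pymysql driver and the official elasticsearch client. The connection settings, table name, and index name ("my_table", "my_index") are placeholders based on the example columns above, not anything from your actual setup:

# Sketch only: poll MySQL, collapse the flat rows into one document per ID
# with the remaining columns collected under "group", then bulk index into ES.
from collections import defaultdict

import pymysql
import pymysql.cursors
from elasticsearch import Elasticsearch, helpers

def fetch_rows():
    # Placeholder connection settings and query.
    conn = pymysql.connect(host="localhost", user="user", password="secret",
                           db="mydb", cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT ID, FIELD_1, FIELD_2 FROM my_table")
            return cur.fetchall()
    finally:
        conn.close()

def build_docs(rows):
    # Group the flat result set by ID, nesting FIELD_1/FIELD_2 under "group".
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["ID"]].append({"FIELD_1": row["FIELD_1"],
                                   "FIELD_2": row["FIELD_2"]})
    for doc_id, group in grouped.items():
        yield {"_index": "my_index", "_id": doc_id,
               "_source": {"id": doc_id, "group": group}}

if __name__ == "__main__":
    es = Elasticsearch("http://localhost:9200")
    helpers.bulk(es, build_docs(fetch_rows()))

You could run something like this on a schedule (cron, for example) instead of the Logstash JDBC input, or keep Logstash for the simpler flat indexing and only use a script where you need the nested structure.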