Hi @Wellington_Bezerra ,
If I understand you correctly, you are getting multiple separate events from your SQL database and you want to merge events that share the same ID into a single document in Elasticsearch.
To do so, you should set document_id in the elasticsearch output in Logstash and also reference a script that runs whenever an existing document needs to be updated.
Here is an example of such an output:
elasticsearch {
    hosts => ["yourhost:9200"]
    cacert => "path to cacert if required"
    index => "<your index name>"
    doc_as_upsert => "true"
    action => "update"
    script_type => "indexed"
    script_lang => ""
    script => "<script name on elasticsearch>"
    document_id => "%{[ID]}"
    user => "user"
    password => "pass"
}
Notice that action is set to update and doc_as_upsert is set to true, which means that if the ID does not yet exist in the Elasticsearch index, the document will be created from the event (without running the script).
In your script you can refer to the data already indexed in Elasticsearch using ctx._source, and to the new event coming from Logstash as params.event.get('any field name from the event').
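The script itself has to be stored on Elasticsearch under the name you put in the script option. As a rough sketch (assuming the array field is called records and a recent Elasticsearch version; the script name merge-records is just a placeholder):

PUT _scripts/merge-records
{
  "script": {
    "lang": "painless",
    "source": "if (ctx._source.records == null) { ctx._source.records = []; } ctx._source.records.add(params.event.get('records'));"
  }
}

This appends the records object of each new event to the array on the already indexed document, and you would then reference merge-records in the script option above; when the document does not exist yet, doc_as_upsert creates it from the event instead of running the script.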
The source data can't be an array as you posted; it must be an object. Assuming you are going to call the array records, you should manipulate each record in Logstash to look something like the following:
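(This is only a rough sketch; the field names below are placeholders, since I don't know your actual columns.)

{
  "ID": 1,
  "records": {
    "column_a": "value a",
    "column_b": "value b"
  }
}

That way every event carries its record as a single records object, which the stored script can append to the records array on the existing Elasticsearch document.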