I am loading data from a SQL database into Elasticsearch using a Logstash pipeline. I am trying to add a new nested document with the mutate filter, but it overwrites the existing one.
If you want to append an entry to an array in Elasticsearch when Logstash processes a different set of projectInfo entries, that is really an Elasticsearch question. It may be possible with a scripted upsert (I do not know). Otherwise you would have to fetch the document from Elasticsearch (possibly using an elasticsearch filter), merge in the additional information (probably requiring a ruby filter), and then send the merged document back to Elasticsearch.
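The scripted-upsert route might look something like the sketch below, using the elasticsearch output's `action => "update"` with an inline Painless script that appends the incoming entry to the array instead of replacing it. The field names `projectId` and `projectInfo`, the index name, and the host are assumptions based on the question, not tested against your mapping.

```
output {
  elasticsearch {
    hosts          => ["localhost:9200"]   # assumption: local cluster
    index          => "projects"           # assumption: target index name
    document_id    => "%{projectId}"       # assumption: the field identifying the doc
    action         => "update"
    scripted_upsert => true
    script_lang    => "painless"
    script_type    => "inline"
    # The event is exposed to the script as params.event by default.
    # Create the array if it does not exist yet, then append rather than overwrite.
    script => '
      if (ctx._source.projectInfo == null) {
        ctx._source.projectInfo = [];
      }
      ctx._source.projectInfo.add(params.event.projectInfo);
    '
  }
}
```

Note that this pushes the merge into Elasticsearch itself, so no fetch/merge round trip in Logstash is needed; the trade-off is that every insert becomes an update request against the existing document.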