Thank you, Magnus, for the quick response. Below are my full configuration and the output from Logstash. The result I see is that the tags field of the record is overwritten with the value I provide in the add_tag option. As shown below, the tags field is not present in the event created by the jdbc input, but it does exist in the Elasticsearch document with the same id. Could this be the reason?
Logstash Configuration:
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.45-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://ip:port/schema?zeroDateTimeBehavior=convertToNull"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_default_timezone => "UTC"
    statement => "SELECT 'account' AS 'record_type', id AS 'record_id' FROM table WHERE TRUE AND id = 1880713"
  }
}
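For context on the "reset" behaviour: when the matching elasticsearch output performs a partial update, any array field supplied in the partial document (such as tags) replaces the stored array wholesale rather than merging with it. A minimal sketch of such an output section follows; the host, index name, and the document_id template built from the query's record_type and record_id columns are assumptions, not taken from the original configuration:

output {
  elasticsearch {
    hosts => ["localhost:9200"]                   # assumed host
    index => "accounts"                           # assumed index name
    document_id => "%{record_type}_%{record_id}"  # assumed id template, e.g. account_1880713
    action => "update"
    # With a plain partial-document update, a tags array in the event
    # replaces the tags array stored in Elasticsearch entirely.
  }
}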
I don't understand. I've read through your configuration a number of times and the only place where you're adding a tag is the mutate filter adding the "test" tag. Naturally the only tag the document will have in the end is the "test" tag.
Thanks, Magnus, for your response. I will try to explain the scenario we're testing, as I believe it wasn't clear from my comments:
A document with document_id account_1880713 exists in Elasticsearch and has its tags field populated.
In Logstash we perform a query using the jdbc input plugin and, depending on the result, we would like to add a tag to the existing document in Elasticsearch using an update action.
When creating the event in Logstash we do not know which tags already exist in Elasticsearch; what we are trying to achieve is to append to the existing list in the tags field.
I hope our use case is clearer now; let me know if you need any further info.
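Since the event never contains the tags already stored in Elasticsearch, appending rather than overwriting has to happen on the Elasticsearch side. One way to sketch this is a scripted update in the elasticsearch output, where the event is exposed to the Painless script as params.event; the host, index, id template, and the new_tag field name are illustrative assumptions:

output {
  elasticsearch {
    hosts => ["localhost:9200"]                   # assumed host
    index => "accounts"                           # assumed index
    document_id => "%{record_type}_%{record_id}"  # assumed id template
    action => "update"
    script_lang => "painless"
    script_type => "inline"
    # Append the event's tag only if the stored tags list does not
    # already contain it; ctx._source.tags is the array in the
    # existing Elasticsearch document.
    script => "if (!ctx._source.tags.contains(params.event.get('new_tag'))) { ctx._source.tags.add(params.event.get('new_tag')) }"
  }
}

Alternatively, the elasticsearch filter plugin could fetch the existing document's tags into the event before the output stage, letting Logstash merge the lists itself.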