I have one pipeline with multiple outputs, elasticsearch and kafka.
The requirement is to send a field to elasticsearch, but to drop that field when sending to kafka.
My config takes input from elasticsearch, updates one field on that set of documents, saves it back to elasticsearch, and also sends it to kafka. But I am not able to delete that field ([customcol][flag]) only for the kafka output. The input config is below, followed by a sketch of one possible approach.
input {
  elasticsearch {
    hosts => ["https://10.1.1.X:9200", "https://10.1.2.X:9200", "https://10.1.3.X:9200"]
    index => "iece*"
    user => "myuser"
    password => "mypass"
    query => '{"query": { "bool": { "filter": [ { "bool": { "must_not": [ { "match_phrase": { "customcol.flag": "read" } } ], "minimum_should_match": 1 } }, { "range": { "createdon": { "format": "strict_date_optional_time", "gte": "now-1d", "lte": "now" } } } ] } }, "sort": [ { "createdon": { "order": "desc", "unmapped_type": "boolean" } } ] }'
    size => 500
    scroll => "5m"
    docinfo => true
    docinfo_target => "[@metadata][doc]"
    schedule => "/10 * * * * *"
  }
}
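Logstash outputs cannot modify events, so the field has to be removed in the filter stage on a per-destination copy of the event. Below is a minimal sketch of one common approach using the clone filter: duplicate each event, strip [customcol][flag] from the duplicate, and route the duplicate to kafka while the unmodified original goes to elasticsearch. The clone name for_kafka and the kafka/elasticsearch connection settings are placeholders, and this assumes ECS compatibility is disabled (with ECS compatibility enabled, the clone filter records the clone name in [event][type] instead of the top-level type field).

filter {
  # Duplicate every event; each copy gets type => "for_kafka"
  # ("for_kafka" is a hypothetical clone name)
  clone {
    clones => ["for_kafka"]
  }

  # Strip the field only from the kafka-bound copy; the original
  # event keeps [customcol][flag] and goes to elasticsearch intact
  if [type] == "for_kafka" {
    mutate {
      remove_field => ["[customcol][flag]"]
    }
  }
}

output {
  if [type] == "for_kafka" {
    kafka {
      # placeholder connection settings
      bootstrap_servers => "kafka-broker:9092"
      topic_id => "my-topic"
      codec => json
    }
  } else {
    elasticsearch {
      hosts => ["https://10.1.1.X:9200", "https://10.1.2.X:9200", "https://10.1.3.X:9200"]
      # placeholder write index; the input's "iece*" is a search
      # pattern, not a valid single write index
      index => "my-write-index"
      user => "myuser"
      password => "mypass"
    }
  }
}

Since remove_field runs only on the copy, elasticsearch still receives [customcol][flag], so the must_not match_phrase on customcol.flag in the scheduled query keeps working.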