Update/add multiple records in Elasticsearch using Logstash

We have a requirement.

We have multiple JSON documents like the one below:

"_id": "1",
"_index": "vnfsdb-20181214",
"_type": "l4",
"_source": {
"@timestamp": "2018-10-01",
"Flagvalue": "1",
"Terminaltype": "T2",
"ObjectFlag": "0",
"Modelname2": "demo2",
"Interceptionsystem": "I_Sat",
"CallingIMEI": "356789831620581",
"Langfound": "bengali",
"Lastmodifieddatetime": "2018-10-01"

We want to perform two operations here from a single Logstash conf file:

  1. Insert the JSON document into an index called "alert-20181214".

  2. Store the value of _id in a variable and update the matching document in the "vnfsdb-20181214" index.

We want to add two fields to the "vnfsdb-20181214" index based on the _id stored in that variable.

Can anyone please provide a conf file that performs these operations? It would save my day.

Thanks in advance.


We want to do the update via the Logstash elasticsearch output:

```conf
output {
  elasticsearch {
    hosts => [""]
    index => "test"
    document_type => "type1"
    action => "update"
    script_lang => "painless"
    #script_lang => "groovy"
    script_type => "inline"
    document_id => "1"
    doc_as_upsert => true
    script => 'ctx._source.counter = 786'
  }
}
```

We want to set the counter value to 786 in the test index.

```
GET test/type1/1

{
  "_index": "test",
  "_type": "type1",
  "_id": "1",
  "_version": 9,
  "found": true,
  "_source": {
    "counter": 555
  }
}
```

Please help.

This is an example of how to achieve what your first post describes:

```conf
filter {
  # Tag every event with its default destination index
  mutate { add_field => { "[@metadata][index]" => "alert-20181214" } }

  # Clone the event (clones get their type set to "new-type")
  clone { id => "REMEDY_TRANS_CLONE" clones => ["new-type"] }

  # Clean up the clone you want stripped down and sent to the other index
  if [type] == "new-type" {
    # Use prune to remove all fields but the ones you want to keep
    prune { whitelist_names => [] }
    # Replace your index value with the new destination
    mutate { replace => { "[@metadata][index]" => "vnfsdb-20181214" } }
  }
}

output {
  elasticsearch {
    index => "%{[@metadata][index]}"
  }
}
```

Where is the counter value coming from? If you control the document_id you are sending to, you can avoid scripting and just update the field with whatever value gets applied in the pipeline: preserve your id field in the prune whitelist and apply it to document_id => "%{..}" in the output along with your update action.

Now, if you are incrementing every time that document is updated, you would need to leverage scripting when the count isn't already in your document. Something like:
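A minimal sketch of such a scripted increment in the elasticsearch output (the `my_id` field name here is an assumption; it stands for whichever field you preserved through the prune whitelist to carry the target _id):

```conf
output {
  elasticsearch {
    hosts => [""]
    index => "vnfsdb-20181214"
    document_id => "%{my_id}"    # assumed field holding the target _id
    action => "update"
    script_lang => "painless"
    script_type => "inline"
    # create the counter on the first update, increment it afterwards
    script => 'ctx._source.counter = ctx._source.counter == null ? 1 : ctx._source.counter + 1'
    scripted_upsert => true
  }
}
```

With `scripted_upsert` enabled, the script also runs when the document doesn't exist yet, so the null check seeds the counter at 1 on first sight of an id.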

