Does Logstash process events first in, first out?

Can Logstash ensure that the message read first by the input plugin is also output first by the output plugin?
AND
If the output plugin is elasticsearch, are documents written into Elasticsearch in the same sequence the input plugin read them?
BECAUSE
I want to update earlier data with the data that arrives after it.
Can anyone help? Thanks!

The worker threads can arrive at an output in any order, because any thread can be delayed relative to the others.

For strict in-order processing you need to have only one worker thread, by setting pipeline.workers to 1.
For version 5.6 read https://www.elastic.co/guide/en/logstash/5.6/logstash-settings-file.html
For version 6.0 read https://www.elastic.co/guide/en/logstash/6.0/logstash-settings-file.html
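
For example, a minimal sketch of pinning the pipeline to a single worker; the config file path below is just a placeholder:

    # logstash.yml — one worker thread keeps events in order through filters and outputs
    pipeline.workers: 1

    # or the equivalent command-line flag (config path is a placeholder)
    bin/logstash -w 1 -f /etc/logstash/conf.d/pipeline.conf

Keep in mind that a single worker serializes all filter and output work, so throughput will drop accordingly.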

HOWEVER, if you are trying to overwrite earlier documents with later documents based on some identifier value, then you need a different strategy.

If you are looking for a kind of upsert, replace, overwrite or merge type action, please say so; we can then help further.
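
For instance, if an upsert is what you're after, a rough sketch with the elasticsearch output could look like this (hosts, index, and the some_id_field id field are placeholders):

    output {
        elasticsearch {
            hosts         => ["XXXX"]
            index         => "XXXX"
            document_id   => "%{some_id_field}"  # same id => the later event updates the earlier doc
            action        => "update"
            doc_as_upsert => true                # create the doc if it does not exist yet
        }
    }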


Thanks!

YES, I want to delete the earlier document. Any good advice? Thanks!!!
When field1 is "0" I index a document; then, when a document with the same value of field2 arrives and its field1 != "0", I want to delete the document created before.
I use this configuration now:

input{}
filter{
    # field1 == "0" means index the document; anything else means delete it
    if [field1] == "0" {
        mutate{
            add_field => {
                "[@metadata][action]" => "index"
            }
        }
    } else {
        mutate{
            add_field => {
                "[@metadata][action]" => "delete"
            }
        }
    }
}
output{
    elasticsearch {
        hosts         => ["XXXX"]
        index         => "XXXX"
        document_id   => "%{field2}"              # same field2 value => same ES document
        action        => "%{[@metadata][action]}" # "index" or "delete", set in the filter above
        document_type => "%{type}"
        flush_size      => 300
        idle_flush_time => 10
        sniffing        => false
        codec           => "json"
    }
}

This link shows that you can use the document_id to overwrite the previous doc with the index action.

However, you have some logic to implement here, and timing to consider, I think.

I'm confused. Your first post says that you want to update older docs but your newer post seems to say this:
Documents will only remain in the index if no other docs having the same id AND field1 != 0 are ingested.

That is my goal. In my first post, when I say update the data I mean delete it; sorry for not saying it clearly.
So, if I want to use Logstash to achieve my goal, setting pipeline.workers to 1 will help.
I will try it.
Thanks for patiently answering my question!

Good luck! Make sure you have the correct spelling in the document_id setting.

field2 is not what I actually used; it's just an example. Anyway, thank you for the reminder,
I will correct it.
