Ruby filter processes events in incorrect order

(Heckler Global Operation) #1

Here is my test configuration
I create a simple list of events like below; the events are numbered according to their order of appearance:

2016-12-06 07:00:00,251 event1
2016-12-06 07:00:10,651 event2
2016-12-06 07:02:00,251 event3
2016-12-06 07:05:00,451 event4

In the ruby filter, I define a global variable $messageNo to keep track of the event number:

ruby {
  code => "
    unless defined? $messageNo
      $messageNo = 0
    end
    $messageNo += 1
    event.set('messageNo', $messageNo)
  "
}
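(Editor's note: the effect of several Logstash worker threads sharing one global counter can be sketched outside Logstash in plain Ruby. This is a hypothetical simulation, not Logstash code: each thread numbers the events of its own batch, so every event receives a unique number, but which event gets which number depends on thread scheduling, not on arrival order.)

```ruby
# Simulate Logstash worker threads numbering events from a shared counter.
events  = (1..8).map { |i| { 'message' => "event#{i}" } }
batches = events.each_slice(2).to_a   # four "batches" of two events

counter = 0
mutex   = Mutex.new

# One thread per batch, like parallel pipeline workers.
threads = batches.map do |batch|
  Thread.new do
    batch.each do |event|
      mutex.synchronize do
        counter += 1
        event['messageNo'] = counter
      end
    end
  end
end
threads.each(&:join)

# Every number 1..8 is assigned exactly once, but the mapping from
# event to number depends on which thread ran first.
numbers = events.map { |e| e['messageNo'] }.sort
```

The mutex guarantees each number is handed out once, yet it cannot make the numbering follow the original event order, which is exactly the behavior reported below.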

In the end, I insert the processed events into Elasticsearch:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "message_index"
  }
}

But looking at the results in Elasticsearch (pictures attached), the events are not processed in order by the ruby filter. In the pictures, you can see 'event1' has messageNo 4 while 'event3' has messageNo 1.

(Ed) #2

This is correct: Logstash filters are batch-oriented and threaded, so there is no way to guarantee the order in which events are processed.

(Heckler Global Operation) #3

This is a problem for me :frowning: I must find a way to process them in sequence.

(Ed) #4

Why not just use the timestamp as an ordering mechanism?
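(Editor's note: if the goal is only to read the events back in order, sorting at query time may be enough. A minimal sketch, assuming the message_index index from the configuration above:)

```
GET message_index/_search
{
  "sort": [
    { "@timestamp": { "order": "asc" } }
  ]
}
```

This returns documents ordered by their original timestamps regardless of the order in which the filter processed them.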

(Heckler Global Operation) #5

Yes, the @timestamp field is in order, but I'm trying to work out how to control the order in which the Ruby script processes those events.

(Ed) #6

What are you trying to accomplish? You will have to write some other tool, as Logstash is not going to be able to do what you want. I think you may have to rethink how to get what you need.

(Magnus Bäck) #7

Have you tried setting the number of pipeline workers to one?
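(Editor's note: the worker count can be set either on the command line with the -w flag or via pipeline.workers in logstash.yml; path and config file names below are illustrative. Note that a single worker removes filter parallelism, so throughput drops.)

```
# Command line:
bin/logstash -w 1 -f my_pipeline.conf

# Or in logstash.yml:
pipeline.workers: 1
```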

(Heckler Global Operation) #8

Thank you, it works!! Reducing the number of workers to 1 allows it to process the events in sequence :slight_smile:
I also found a good tutorial about creating a custom filter to keep track of the event order -

(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.