Logstash - grok - ruby and random numbers


Maybe someone has an idea here.

We use the following setup for our log filtering:

source -- (dns-rr) --> 3 x logstash -> 3-node-rabbitmq-cluster -> 3 x logstash -> elasticsearch-cluster

On the 3-node RabbitMQ cluster we have set up an exchange of type x-modulus-hash with, for example, one queue on each node bound to that exchange.

On the first set of Logstash hosts, which push the messages to RabbitMQ, I use

    ruby {
        code => "event.set('random_number', rand(100).to_s)"
    }

to create a random number that is used as the routing key for the messages. RabbitMQ then does some magic with the submitted key and puts the message into one of the queues.
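As far as I understand it, the "magic" of an x-modulus-hash exchange is that it hashes the routing key and takes the result modulo the number of bound queues. A rough sketch of the idea in plain Ruby (the queue names are hypothetical, and `String#hash` stands in for whatever hash RabbitMQ uses internally on the Erlang side):

```ruby
# Illustrative modulus-hash routing: same routing key -> same queue.
# RabbitMQ uses its own Erlang-side hash; Ruby's String#hash is only a
# stand-in to show the mechanism.
queues = ["queue-node1", "queue-node2", "queue-node3"]  # hypothetical names

def queue_for(routing_key, queues)
  queues[routing_key.hash % queues.size]
end

puts queue_for("42", queues)  # deterministic for a given Ruby process
```

So an uneven distribution of routing keys translates directly into an uneven distribution of messages across the three queues.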

On the filtering Logstash nodes that fetch those messages from RabbitMQ, we now see the following:

  • logstash filter node 1 carries the most load
  • logstash filter node 2 a little less
  • logstash filter node 3 even less

I then had a look at Ruby's rand function and searched for information about it. Several sources say this function can cause problems, so I looked at the random_number field values in Kibana, and indeed the distribution of values across the 0–99 range is not very even.
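One way to check whether rand itself is the culprit is to tally a large sample outside of Logstash (plain-Ruby sketch; Logstash runs on JRuby, which may differ slightly, but both default to a Mersenne Twister PRNG for `Kernel#rand`):

```ruby
# Tally 100,000 draws of rand(100); a uniform distribution would put
# roughly 1,000 draws into each of the 100 buckets.
counts = Hash.new(0)
100_000.times { counts[rand(100)] += 1 }

expected = 100_000 / 100.0
max_dev  = counts.values.map { |c| (c - expected).abs / expected }.max
puts "buckets: #{counts.size}, max deviation from uniform: #{(max_dev * 100).round(1)}%"
```

If this looks flat on its own but the Kibana field does not, the skew is more likely coming from how the events (or the hashed keys) are distributed than from rand itself.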

So I wanted to ask whether anyone here has seen something similar, or has another idea for generating "real" random numbers so that the events are distributed more evenly?
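Two directions I would try myself (both untested sketches, not something from our pipeline): SecureRandom, which draws from the OS entropy source instead of the seeded PRNG, or a plain round-robin counter, which sidesteps randomness entirely and guarantees a perfectly even spread of routing keys:

```ruby
require 'securerandom'

# Alternative 1: SecureRandom draws from the OS entropy source instead of
# the Mersenne Twister PRNG behind Kernel#rand.
key = SecureRandom.random_number(100).to_s

# Alternative 2 (hypothetical): a round-robin counter cycles through the
# 100 routing keys, so every key is used equally often by construction.
counter  = -1
next_key = -> { counter = (counter + 1) % 100; counter.to_s }

keys = Array.new(300) { next_key.call }
puts key, keys.first(3).inspect  # => ["0", "1", "2"]
```

In a Logstash ruby filter the SecureRandom variant would look roughly like `ruby { init => "require 'securerandom'" code => "event.set('random_number', SecureRandom.random_number(100).to_s)" }`, and the round-robin counter would need to live in an instance variable such as `@counter` so it persists across events (both untested sketches).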

thanks for any input :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.