[LogStash] how to intentionally slow down JDBC Input Plugin

Table Of Contents

  1. My situation
  2. What I need
  3. How you can help

1. My situation

  1. I have to run the Logstash JDBC input against a database running in production.
  2. I use pagination options to limit the amount of data per query,
  3. but since the SQL statements run almost non-stop,
  4. database metrics spike, affecting the performance of other functionality.
  5. I need to slow down the logstash-input-jdbc queries.

2. What I need: to slow down queries to the database

My imagination: it could be something like the diagram below (diagram not shown), anything to slow down the queries sent to the database.

You could try using a sleep filter. By default there is an in-memory queue between the input and the pipeline. If the queue fills up then the input stops reading data.

If you want to limit the rate at which the queue is drained (by each worker thread) to 1000 events per second, you could try

sleep { time => "0.001" }

(i.e. sleep for a millisecond per event). The event rate will settle somewhere below 1000 per second.

Looking at this code suggests that

sleep { time => 0.001 }

would also work and be more efficient, but that would very much depend on how the validation works (it may do implicit coercion, it may not).

By limiting the rate at which events are removed from the queue you can limit the rate at which events are pulled from the database and added to the queue.
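Putting the pieces together, a full pipeline might look like the sketch below. The connection settings, schedule, SQL statement, and output are placeholders for illustration; only the pagination options and the sleep filter come from the discussion above.

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://db.example.com:5432/mydb"  # placeholder
    jdbc_user => "readonly"                                                 # placeholder
    jdbc_driver_class => "org.postgresql.Driver"                            # placeholder driver
    statement => "SELECT * FROM events WHERE updated_at > :sql_last_value"  # placeholder query
    jdbc_paging_enabled => true   # paginate to limit the amount of data per query
    jdbc_page_size => 10000
    schedule => "* * * * *"       # run once a minute
  }
}

filter {
  # Sleep 1 ms per event in each worker thread, capping the drain rate
  # (and therefore the rate of queries against the database) at roughly
  # 1000 events per second per worker.
  sleep {
    time => "0.001"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # placeholder
  }
}
```

Back-pressure from the in-memory queue does the rest: once the queue fills because the sleep filter drains it slowly, the jdbc input stops fetching new pages.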

@Badger thank you for your solution.
Sounds like a plan!
I will let you know how it turns out.

@Badger it worked! thanks
