Logstash with JDBC performance


I use Logstash with the JDBC connector.

I have 6 pipelines defined in my pipeline.yml.

pipeline1 has 14 SQL queries
pipeline2 has 10 SQL queries
pipeline3 has 2 SQL queries
pipeline4 has 8 SQL queries
pipeline5 has 1 SQL query
pipeline6 has 1 SQL query

When I add a new SQL query or a new pipeline, my server stops working.

System configuration:
4 CPUs
32 GB RAM

What do you suggest?
More CPU or memory?
Or one more Logstash instance?
I will add many new SQL requests in the future.

Many thanks in advance.



What does this mean? More detail is needed for further discussion.


Sorry, it's not easy to explain.

None of the SQL queries executed by Logstash work anymore.

Logstash doesn't retrieve any SQL results when I add a new SQL query to my pipeline.



What do the logs show when you add a new SQL query or a new pipeline? Is there anything different?

You may enable the slow log and check it.


This new SQL query creates a new index.

But this query connects to 55 systems and retrieves the information needed.

I have the same setup on my Logstash DEV instance, but with only 20 connections, and there it works without any problem.

With my Logstash PROD instance and 55 connections, the problem occurs.

Can you help me enable the slow log, please?
Is it configured in log4j2.properties?
What do I need to specify?

Many thanks in advance.

Slowlog here.
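In short, the slow log is enabled through threshold settings in logstash.yml (not log4j2.properties, which only controls where the slow log is written). The values below are illustrative, not recommendations:

```
# logstash.yml -- slow log thresholds (example values; tune to your needs).
# Events that spend longer than a threshold in a filter are logged to
# logs/logstash-slowlog-plain-*.log via the log4j2 slowlog appender.
slowlog.threshold.warn: 2s
slowlog.threshold.info: 1s
slowlog.threshold.debug: 500ms
slowlog.threshold.trace: 100ms
```

Note that the slow log tracks time spent in filters; it will not time the JDBC statements themselves, so database-side query timing is worth checking separately.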

Please use the correct terminology to avoid confusion. What is a "new SQL"? A new pipeline with the JDBC input plugin?

That said, you should first determine whether the problem occurs when adding ANY JDBC request, or only with that specific pipeline.


Here is the config of my pipelines.yml:

cat pipelines.yml

This file is where you define your pipelines. You can define multiple.

For more information on multiple pipelines, see the documentation:

Multiple Pipelines | Logstash Reference [8.0] | Elastic

```
- pipeline.id: KPI
  path.config: "/etc/logstash/conf.d/KPI.conf"
- pipeline.id: check
  path.config: "/etc/logstash/conf.d/CHECK.conf"
- pipeline.id: Check2
  path.config: "/etc/logstash/conf.d/CHECK2.conf"
- pipeline.id: Check3
  path.config: "/etc/logstash/conf.d/CHECK3.conf"
- pipeline.id: IBMi_information
  path.config: "/etc/logstash/conf.d/IBMI_INFORMATION.conf"
- pipeline.id: RUNCHECK
  path.config: "/etc/logstash/conf.d/RUNCHECK.conf"
- pipeline.id: RUNCHECKBRMS
  path.config: "/etc/logstash/conf.d/RUNCHECKBRMS.conf"
```

Each config file contains an input and an output, with a JDBC config and an SQL query.
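As an illustration only (not my real file; the driver, connection string, credentials, and query below are placeholders), each conf file looks roughly like this:

```
input {
  jdbc {
    # Placeholder driver and connection details
    jdbc_driver_library => "/opt/drivers/jt400.jar"
    jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
    jdbc_connection_string => "jdbc:as400://myhost"
    jdbc_user => "user"
    jdbc_password => "secret"
    schedule => "*/5 * * * *"          # run the statement every 5 minutes
    statement => "SELECT ... FROM ..."
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "kpi-%{+YYYY.MM.dd}"
  }
}
```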

When I try to add a new conf file to my pipelines, everything becomes very, very slow.

I hope that's clearer.

Many thanks


What happens if you add a pipeline with a JDBC input running some simple SQL query? That would tell you whether the problem occurs when adding ANY SQL request, or only with that specific pipeline.

Are you sure every SQL query included in the new pipeline is reasonably fast? Doesn't it contain queries that are too slow and cause congestion?
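For example, a throwaway pipeline with a trivial query would isolate the problem (a sketch, assuming DB2 for i given your IBMi pipeline names; driver settings are placeholders matching your existing pipelines):

```
input {
  jdbc {
    # Reuse the same driver settings as your existing pipelines (placeholders)
    jdbc_driver_library => "/opt/drivers/jt400.jar"
    jdbc_driver_class => "com.ibm.as400.access.AS400JDBCDriver"
    jdbc_connection_string => "jdbc:as400://myhost"
    jdbc_user => "user"
    jdbc_password => "secret"
    schedule => "* * * * *"            # every minute
    # Trivial query: returns instantly if the connection itself is healthy
    statement => "SELECT 1 FROM SYSIBM.SYSDUMMY1"
  }
}

output {
  stdout { codec => rubydebug }        # print to console instead of indexing
}
```

If even this pipeline stalls everything, the problem is resource contention between pipelines; if it runs fine, the slow queries in your new pipeline are the likely suspect.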

This is the first step of Performance Troubleshooting of Logstash.

  1. Check the performance of input sources and output destinations:
  • Logstash is only as fast as the services it connects to. Logstash can only consume and produce data as fast as its input and output destinations can!
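With 4 CPUs and this many pipelines, it can also help to cap each pipeline's worker threads so that one slow pipeline cannot starve the others. A sketch of per-pipeline tuning in pipelines.yml (the values are illustrative, not recommendations):

```
- pipeline.id: KPI
  path.config: "/etc/logstash/conf.d/KPI.conf"
  pipeline.workers: 1        # a polling JDBC pipeline rarely needs many workers
  pipeline.batch.size: 125   # the default; lower it if memory is tight
```

By default every pipeline gets one worker per CPU core, so seven-plus pipelines on a 4-CPU box can oversubscribe the machine; explicit per-pipeline limits keep the contention predictable.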

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.