Hello all,
I have an ELK Stack 6.2.2 with an X-Pack trial license. I am running a multiple-pipeline configuration with JDBC inputs, and sometimes I notice that my data in Elasticsearch is not updated (particularly when multiple pipelines run simultaneously). I notice this through the @timestamp field in the Discover section of Kibana. Currently I have 7 pipelines that I want to run simultaneously, and they are all scheduled to query the database at 5-second intervals.
What do you suggest for my situation? Are 5-second intervals too tight, possibly overwhelming the SQL Server? I wonder whether the scheduler can somehow be adjusted to wait for the previous run to finish before executing the next scheduled query.
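One thing I considered (just a sketch, not yet tested on my setup) is staggering the cron expressions so the seven pipelines do not all fire on the same second. As I understand it, the jdbc input's schedule option uses rufus-scheduler cron syntax with a leading seconds field, so comma lists of seconds are valid:

```
# Pipeline A fires at seconds 0, 10, 20, ...
schedule => "0,10,20,30,40,50 * * * * *"
# Pipeline B fires at seconds 5, 15, 25, ...
schedule => "5,15,25,35,45,55 * * * * *"
```

This stretches each pipeline's interval to 10 seconds, but no two pipelines would hit SQL Server at the same moment. Would this be a reasonable approach?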
Thanks in advance
My pipelines.yml file is as follows:
## Never give the full path on Windows
- pipeline.id: Training
  path.config: "./pipelines/sql_tables_Training.conf"
- pipeline.id: YKBÜ
  path.config: "./pipelines/sql_tables_YKBÜ.conf"
- pipeline.id: YKBÜ Session
  path.config: "./pipelines/sql_tables_YKBÜ_session.conf"
- pipeline.id: Training Processes
  path.config: "./pipelines/sql_Training_processes.conf"
- pipeline.id: YKBÜ Processes
  path.config: "./pipelines/sql_YKBÜ_processes.conf"
- pipeline.id: Training Queues
  path.config: "/Users/ongun.arisev/Downloads/ElasticStack/logstash-6.2.2/pipelines/sql_Training_queues.conf"
- pipeline.id: YKBÜ Queues
  path.config: "./pipelines/sql_YKBÜ_queues.conf"
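For what it's worth, I understand that pipelines.yml also accepts per-pipeline settings from logstash.yml, such as pipeline.workers. A sketch of what I could try to limit the resources each pipeline takes (not yet tested):

```
- pipeline.id: Training
  path.config: "./pipelines/sql_tables_Training.conf"
  pipeline.workers: 1
```

Would limiting workers per pipeline help when seven pipelines run at once?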
Each pipeline configuration file is similar, so I post one of them here as an example:
input {
  jdbc {
    jdbc_driver_library => "C:\Program Files\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://;integratedSecurity=true"
    schedule => "*/5 * * * * *"
    jdbc_user => "ongun.arisev"
    #statement_filepath => "C:\Users\ongun.arisev\Downloads\ElasticStack\logstash-6.2.2\sqlprocedure.txt"
    statement => "EXEC [YKBÜ].[dbo].[BPDS_QueueVolumesNow];"
    tags => ["YKBÜ_queues"]
  }
}
filter {
  mutate {
    add_field => {
      "Veritabani" => "YKBÜ"
    }
  }
}
output {
  # stdout { codec => "rubydebug" }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sql_queues_ykb"
    document_id => "%{name}"
    user => "logstash_internal"
    password => "x-pack-test-password"
  }
}
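One more idea I had: if I could replace the stored procedure call with a plain query against a table that has an incrementing column, I understand the jdbc input can track its position with :sql_last_value and fetch only new rows, which should lighten the load on SQL Server. A sketch, assuming a hypothetical table dbo.Queues with a numeric id column (my real schema may differ):

```
input {
  jdbc {
    # ... same driver and connection settings as above ...
    schedule => "*/5 * * * * *"
    use_column_value => true
    tracking_column => "id"                # hypothetical incrementing column
    tracking_column_type => "numeric"
    statement => "SELECT * FROM dbo.Queues WHERE id > :sql_last_value"
  }
}
```

Is incremental fetching like this advisable in my case, or is the problem elsewhere?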