Persistent queue back pressure

I am using Logstash with the JDBC input plugin and the Elastic App Search output plugin. I have around 9,600,000 records, of which around 8,700,000 were successfully uploaded to App Search; the rest have been pending for 24 hours. Logstash is still running. I checked the persistent queue and it is full (it was the default 1024mb size). I tried to increase the size in logstash.yml; since I was already running Logstash with the -r parameter, it reloaded the config, but it is still stuck and no data is being uploaded to App Search. Any clue would be highly appreciated.
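For reference, this is roughly what I changed in logstash.yml (values are illustrative, not a recommendation). As I understand the docs, -r only reloads the pipeline config file, not logstash.yml settings, so a change to queue.max_bytes may only take effect after a full restart:

```yaml
# logstash.yml -- illustrative queue settings
queue.type: persisted
# Default is 1024mb; note this setting is read at startup,
# so -r (pipeline config reload) does not apply it.
queue.max_bytes: 4gb
```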
Here are some logs from the point where it last inserted data into App Search:

[2022-04-03T14:04:00,688][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2022-04-03T14:04:01,096][DEBUG][logstash.outputs.elasticappsearch][main][728a28f8dbbffb61ef3da976dc684ee24c5c509cf11f8094c41c32221ef98383] Creating new engine segment in batch to send {:resolved_engine=>"prod-product"}
[2022-04-03T14:04:01,096][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x5b81d01b dead>"}
[2022-04-03T14:04:01,097][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x5601ff7b dead>"}
[2022-04-03T14:04:01,097][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x123fbebd run>"}
[2022-04-03T14:04:02,126][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<Thread:0x3513c616 dead>"}
[2022-04-03T14:04:02,128][DEBUG][logstash.filters.mutate  ][main] Closing {:plugin=>"LogStash::Filters::Mutate"}
[2022-04-03T14:04:02,133][DEBUG][logstash.pluginmetadata  ][main] Removing metadata for plugin fcb18f22510f32695f50d56c334e7b0c6ad38faba79ea77ea11d6b99bf375e46
[2022-04-03T14:04:02,134][DEBUG][logstash.outputs.elasticappsearch][main] Closing {:plugin=>"LogStash::Outputs::ElasticAppSearch"}
[2022-04-03T14:04:02,137][DEBUG][logstash.pluginmetadata  ][main] Removing metadata for plugin 728a28f8dbbffb61ef3da976dc684ee24c5c509cf11f8094c41c32221ef98383
[2022-04-03T14:04:02,172][DEBUG][logstash.javapipeline    ][main] Pipeline has been shutdown {:pipeline_id=>"main", :thread=>"#<Thread:0x75cba0fa run>"}
[2022-04-03T14:04:02,173][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-04-03T14:04:03,400][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-04-03T14:04:03,400][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-04-03T14:04:03,686][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/home/centos/logstash-8.1.1/CONTRIBUTORS", "/home/centos/logstash-8.1.1/Gemfile", "/home/centos/logstash-8.1.1/Gemfile.lock", "/home/centos/logstash-8.1.1/LICENSE.txt", "/home/centos/logstash-8.1.1/NOTICE.TXT", "/home/centos/logstash-8.1.1/bin", "/home/centos/logstash-8.1.1/config", "/home/centos/logstash-8.1.1/data", "/home/centos/logstash-8.1.1/jdk", "/home/centos/logstash-8.1.1/lib", "/home/centos/logstash-8.1.1/logs", "/home/centos/logstash-8.1.1/logstash-core", "/home/centos/logstash-8.1.1/logstash-core-plugin-api", "/home/centos/logstash-8.1.1/logstash.yml", "/home/centos/logstash-8.1.1/modules", "/home/centos/logstash-8.1.1/tools", "/home/centos/logstash-8.1.1/vendor", "/home/centos/logstash-8.1.1/x-pack"]}
[2022-04-03T14:04:03,686][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/centos/logstash-8.1.1/logstash.conf"}
[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] -------- Logstash Config ---------
[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] Config from source, source: LogStash::Config::Source::Local, pipeline_id:: main
[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] Config string, protocol: file, id: /home/centos/logstash-8.1.1/logstash.conf
[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] 

input {
  jdbc {
    jdbc_driver_library => "/home/centos/temp/mysql-connector-java-8.0.28/mysql-connector-java-8.0.28.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://****"
    jdbc_user => "***"
    jdbc_password => "***!"
    statement => "SELECT *, UNIX_TIMESTAMP(updated_date) AS unix_ts_in_secs FROM viewname WHERE (UNIX_TIMESTAMP(updated_date) > :sql_last_value AND updated_date < NOW()) ORDER BY updated_date ASC"
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    tracking_column_type => "numeric"
    jdbc_paging_enabled => true
    jdbc_page_size => 100000
  }
}

filter {
  mutate {
    gsub => [ "color_id", " ", "" ]
    add_field => { "[@metadata][_id]" => "%{product_id}_%{color_id}" }
    remove_field => ["unix_ts_in_secs"]
  }
}

output {
  elastic_app_search {
    api_key => "privatekey"
    url => "url"
    engine => "engine name"
    document_id => "%{[@metadata][_id]}"
  }
}

[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] Merged config
[2022-04-03T14:04:03,687][DEBUG][org.logstash.config.ir.PipelineConfig] 

From here on, it just keeps repeating this in the logs:

input {
  jdbc {
    jdbc_driver_library => "/home/centos/temp/mysql-connector-java-8.0.28/mysql-connector-java-8.0.28.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://****"
    jdbc_user => "****"
    jdbc_password => "****!"
    statement => "SELECT *, UNIX_TIMESTAMP(updated_date) AS unix_ts_in_secs FROM viewSearch WHERE (UNIX_TIMESTAMP(updated_date) > :sql_last_value AND updated_date < NOW()) ORDER BY updated_date ASC"
    tracking_column => "unix_ts_in_secs"
    use_column_value => true
    tracking_column_type => "numeric"
    jdbc_paging_enabled => true
    jdbc_page_size => 100000
  }
}

filter {
  mutate {
    gsub => [ "color_id", " ", "" ]
    add_field => { "[@metadata][_id]" => "%{product_id}_%{color_id}" }
    remove_field => ["unix_ts_in_secs"]
  }
}

output {
  elastic_app_search {
    api_key => "Private"
    url => "url"
    engine => "engine"
    document_id => "%{[@metadata][_id]}"
  }
}

[2022-04-03T14:04:03,688][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2022-04-03T14:04:06,686][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/home/centos/logstash-8.1.1/CONTRIBUTORS", "/home/centos/logstash-8.1.1/Gemfile", "/home/centos/logstash-8.1.1/Gemfile.lock", "/home/centos/logstash-8.1.1/LICENSE.txt", "/home/centos/logstash-8.1.1/NOTICE.TXT", "/home/centos/logstash-8.1.1/bin", "/home/centos/logstash-8.1.1/config", "/home/centos/logstash-8.1.1/data", "/home/centos/logstash-8.1.1/jdk", "/home/centos/logstash-8.1.1/lib", "/home/centos/logstash-8.1.1/logs", "/home/centos/logstash-8.1.1/logstash-core", "/home/centos/logstash-8.1.1/logstash-core-plugin-api", "/home/centos/logstash-8.1.1/logstash.yml", "/home/centos/logstash-8.1.1/modules", "/home/centos/logstash-8.1.1/tools", "/home/centos/logstash-8.1.1/vendor", "/home/centos/logstash-8.1.1/x-pack"]}
[2022-04-03T14:04:06,686][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/centos/logstash-8.1.1/logstash.conf"}
[2022-04-03T14:04:06,686][DEBUG][org.logstash.config.ir.PipelineConfig] -------- Logstash Config ---------
[2022-04-03T14:04:06,686][DEBUG][org.logstash.config.ir.PipelineConfig] Config from source, source: LogStash::Config::Source::Local, pipeline_id:: main
[2022-04-03T14:04:06,686][DEBUG][org.logstash.config.ir.PipelineConfig] Config string, protocol: file, id: /home/centos/logstash-8.1.1/logstash.conf
[2022-04-03T14:04:06,686][DEBUG][org.logstash.config.ir.PipelineConfig] 

I have also tried increasing the queue size in logstash.yml; it created a new page under queue/main, but it is still stuck, and no data is being inserted into either the new queue page or the App Search engine.
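To see whether anything is still flowing, I have been summarizing the response from Logstash's node stats API (http://localhost:9600/_node/stats/pipelines). A small sketch of what I check, with a made-up sample response (the field names below are my reading of the stats API and may not match every version exactly):

```python
import json

# Sample shaped like the relevant part of the /_node/stats/pipelines
# response. Structure is assumed from the docs; the numbers are made up
# to mirror my situation (queue at its 1 GiB cap).
sample = json.loads("""
{
  "pipelines": {
    "main": {
      "events": {"in": 9600000, "out": 8700000},
      "queue": {
        "type": "persisted",
        "events_count": 900000,
        "queue_size_in_bytes": 1073741824,
        "max_queue_size_in_bytes": 1073741824
      }
    }
  }
}
""")

def queue_summary(stats, pipeline="main"):
    """Return (backlog in events, queue fullness as a 0.0-1.0 fraction)."""
    p = stats["pipelines"][pipeline]
    q = p["queue"]
    backlog = p["events"]["in"] - p["events"]["out"]
    fullness = q["queue_size_in_bytes"] / q["max_queue_size_in_bytes"]
    return backlog, fullness

backlog, fullness = queue_summary(sample)
print(f"backlog={backlog} events, queue {fullness:.0%} full")
# -> backlog=900000 events, queue 100% full
```

A full queue with a backlog that never shrinks is what makes me suspect the output side is applying back pressure rather than the queue size itself being the problem.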
Thanks
