Logstash with queue enabled does not ack http input events after jdbc input runs

Hi

I'm using logstash with queuing enabled.

I've set up Logstash to ingest rows from MySQL via the jdbc input plugin on startup. Currently this ingests 1846 rows.

I also have a http input.

When I take down Elasticsearch and restart Logstash I get errors, as expected:

logstash_1 WARN logstash.outputs.amazones - Failed to flush outgoing items {:outgoing_count=>1, :exception=>"Faraday::ConnectionFailed", :backtrace=>nil}
logstash_1 ERROR logstash.outputs.amazones - Attempted to send a bulk request to Elasticsearch configured at ...

I'd expect that in this situation, hitting the Logstash http input would still result in an ack (a 200 response once the event has been written to the persistent queue).

In practice the http POST never returns, and the event never appears in the Logstash logs.
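For reference, here's roughly how I'm exercising the http input while ES is down (a minimal sketch; the localhost endpoint and the test payload are placeholder assumptions, not my real client):

import requests

# Hypothetical test event; my real payloads are ~850 characters of JSON.
payload = {"documentid": "test-1", "message": "hello"}

try:
    # Port 31311 matches the http input in logstash.conf below.
    resp = requests.post("http://localhost:31311", json=payload, timeout=10)
    print(resp.status_code, resp.text)  # expected: an ack once the event is queued
except requests.exceptions.Timeout:
    print("no ack - the POST just hangs")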

My logstash.yml looks like:

queue.type: persisted
queue.checkpoint.writes: 1
queue.max_bytes: 8gb
queue.page_capacity: 512mb

And my logstash.conf

input {

  jdbc {
    jdbc_connection_string => "${JDBC_CONNECTION_STRING}"
    jdbc_user => "${JDBC_USER}"
    jdbc_password => "${JDBC_PASSWORD}"
    jdbc_driver_library => "/home/logstash/jdbc_driver.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "
     	SELECT blah blah blah
    "
  }

  http {
    host => "0.0.0.0"
    port => 31311
  }
}


output {
  stdout { codec => json_lines }
  amazon_es {
    hosts => ["${AWS_ES_HOST}"]
    region => "${AWS_REGION}"
    aws_access_key_id => "${AWS_ACCESS_KEY_ID}"
    aws_secret_access_key => "${AWS_SECRET_ACCESS_KEY}"
    index => "${INDEX_NAME}"
    document_type => "data"
    document_id => "%{documentid}"
  }
}

Is it possible for the http input to still ack events in this situation? I'm pretty sure the queue can't be full, since each event payload is only about 850 characters.
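As a sanity check on that claim, here's a quick back-of-envelope calculation (a sketch assuming the worst case, i.e. all 1846 jdbc rows still sitting in the queue at ~850 bytes each):

# Worst-case queue usage vs. the configured queue.max_bytes.
events = 1846            # rows ingested by the jdbc input on startup
bytes_per_event = 850    # approximate payload size per event
print(events * bytes_per_event)  # 1569100 bytes, roughly 1.5 MB

Even with generous per-event overhead, that's on the order of 1.5 MB, nowhere near the 8gb queue.max_bytes cap.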

Thanks in advance

bump?
