Logstash 2.2
Elasticsearch 1.7.4
I am seeing an issue where I am consuming from Kafka and sending to Elasticsearch. Logstash consumes about 500 messages at startup and then stops, with this message being written to the logs:
{:timestamp=>"2016-02-26T17:16:33.977000+0000", :message=>"Flushing buffer at interval", :instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x3c356a00 @stopping=#<Concurrent::AtomicBoolean:0x56c244e1>, @last_flush=2016-02-26 17:16:32 +0000, @flush_thread=#<Thread:0x6c383c46 run>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x5ffd4f2b>, @submit_proc=#<Proc:0x2bf99714@/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/common.rb:57>, @flush_interval=1, @logger=#<Cabin::Channel:0x2e4d6053 @subscriber_lock=#<Mutex:0x59ef00e4>, @data={}, @metrics=#<Cabin::Metrics:0x5a23a6dc @channel=#<Cabin::Channel:0x2e4d6053 ...>, @metrics={}, @metrics_lock=#<Mutex:0x5a9fe3bf>>, @subscribers={13010=>#<Cabin::Outputs::IO:0x708f56ae @lock=#<Mutex:0x5283f33d>, @io=#<File:/mnt/log/logstash/logstash.log>>}, @level=:debug>, @buffer=[], @operations_mutex=#<Mutex:0x2b28e3f>>", :interval=>1, :level=>:debug, :file=>"logstash/outputs/elasticsearch/buffer.rb", :line=>"90", :method=>"interval_flush"}
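For reference, here is a minimal sketch of my pipeline config (the ZooKeeper address, topic, and Elasticsearch host below are placeholders, not my real values; the flush settings shown are the defaults, matching the @max_size=500 / @flush_interval=1 visible in the debug log above):

```
input {
  kafka {
    # Placeholder connection details
    zk_connect => "zookeeper:2181"
    topic_id   => "my_topic"
    group_id   => "logstash"
  }
}

output {
  elasticsearch {
    # Defaults written out explicitly: the buffer flushes at 500 events
    # or after 1 second of idle time
    hosts           => ["es-host:9200"]
    flush_size      => 500
    idle_flush_time => 1
  }
}
```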
If I output to stdout or to a file instead, I do not see this issue.
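To rule out the input side, I tested by swapping the elasticsearch output for something along these lines, and events flowed through continuously:

```
output {
  stdout {
    # rubydebug prints each event as a pretty-printed hash
    codec => rubydebug
  }
}
```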