Flushing buffer at interval error

I am using Logstash 2.1.0 and Elasticsearch 2.0.0.

Here is my config:

input {
        file {
                path => "/home/ubuntu/torched_products_addresses.csv"
                type => "ats_locations"
                start_position => "beginning"
        }
}

filter {
        csv {
                columns => ["latitude", "longitude"]
                separator => ","
        }
}

output {
        elasticsearch {
                index => "ats_member_locations"
                hosts => ["localhost:9200"]
        }
        stdout { codec => rubydebug }
}

When I try to load the CSV file, I get the following:

    Logstash startup completed
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x9b8ed66 @operations_mutex=#<Mutex:0x77f1221>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x3408616>, @submit_proc=#<Proc:0x85f0e69@/home/ubuntu/logstash-2.1.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.4-java/lib/logstash/outputs/elasticsearch/common.rb:54>, @logger=#<Cabin::Channel:0x77cefb59 @metrics=#<Cabin::Metrics:0x7039b380 @metrics_lock=#<Mutex:0x266de755>, @metrics={}, @channel=#<Cabin::Channel:0x77cefb59 ...>>, @subscriber_lock=#<Mutex:0x56b189b7>, @level=:info, @subscribers={12450=>#<Cabin::Outputs::IO:0x6b3db8ec @io=#<IO:fd 1>, @lock=#<Mutex:0x2223bd91>>}, @data={}>, @last_flush=2016-04-04 11:16:26 -0400, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x78856453>, @buffer=[], @flush_thread=#<Thread:0x37b92a4 run>>", :interval=>1, :level=>:info}

Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x9b8ed66 @operations_mutex=#<Mutex:0x77f1221>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x3408616>, @submit_proc=#<Proc:0x85f0e69@/home/ubuntu/logstash-2.1.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.4-java/lib/logstash/outputs/elasticsearch/common.rb:54>, @logger=#<Cabin::Channel:0x77cefb59 @metrics=#<Cabin::Metrics:0x7039b380 @metrics_lock=#<Mutex:0x266de755>, @metrics={}, @channel=#<Cabin::Channel:0x77cefb59 ...>>, @subscriber_lock=#<Mutex:0x56b189b7>, @level=:info, @subscribers={12450=>#<Cabin::Outputs::IO:0x6b3db8ec @io=#<IO:fd 1>, @lock=#<Mutex:0x2223bd91>>}, @data={}>, @last_flush=2016-04-04 11:16:27 -0400, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x78856453>, @buffer=[], @flush_thread=#<Thread:0x37b92a4 run>>", :interval=>1, :level=>:info}

Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x9b8ed66 @operations_mutex=#<Mutex:0x77f1221>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x3408616>, @submit_proc=#<Proc:0x85f0e69@/home/ubuntu/logstash-2.1.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.4-java/lib/logstash/outputs/elasticsearch/common.rb:54>, @logger=#<Cabin::Channel:0x77cefb59 @metrics=#<Cabin::Metrics:0x7039b380 @metrics_lock=#<Mutex:0x266de755>, @metrics={}, @channel=#<Cabin::Channel:0x77cefb59 ...>>, @subscriber_lock=#<Mutex:0x56b189b7>, @level=:info, @subscribers={12450=>#<Cabin::Outputs::IO:0x6b3db8ec @io=#<IO:fd 1>, @lock=#<Mutex:0x2223bd91>>}, @data={}>, @last_flush=2016-04-04 11:16:30 -0400, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x78856453>, @buffer=[], @flush_thread=#<Thread:0x37b92a4 run>>", :interval=>1, :level=>:info}
^CSIGINT received. Shutting down the pipeline. {:level=>:warn}
Pipeline shutdown complete. {:level=>:info}
Logstash shutdown completed

The "Flushing buffer at interval" message is not an error.

The problem is most likely that Logstash thinks it has already reached the end of the input file and is waiting for more data to be appended. You may need to clear the sincedb file (or set sincedb_path to /dev/null so the read position is never persisted). Please read the file input documentation.
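For example, a file input along these lines (a sketch reusing the path from your config, not tested here) makes Logstash re-read the whole file from the start on every run instead of resuming from a remembered position:

input {
        file {
                path => "/home/ubuntu/torched_products_addresses.csv"
                type => "ats_locations"
                start_position => "beginning"
                # Don't persist the read position; the file is re-read from
                # the top each time Logstash starts (handy for one-off imports).
                sincedb_path => "/dev/null"
        }
}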

Hi, I am seeing the same message. Did this get resolved?

Here is what I learned from @magnusbaeck:

He said it could be a problem with the file itself. I opened the CSV file in WordPad and found an extra blank line at the bottom; after removing it, the import worked. You will also see this behaviour if the file you are trying to process does not exist. That happened to me when I used the wrong file name.
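If you would rather handle a stray blank line in the pipeline than edit the file by hand, a conditional in front of the csv filter can drop empty events. This is just a sketch of the idea, not something from the original thread:

filter {
        # Drop events whose message is empty or whitespace-only, so a
        # trailing newline in the CSV doesn't produce a bogus document.
        if [message] =~ /^\s*$/ {
                drop { }
        }
        csv {
                columns => ["latitude", "longitude"]
                separator => ","
        }
}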

Thanks