I am using Logstash 2.1.0 and Elasticsearch 2.0.0.
Here is my config:
input {
  file {
    path => "/home/ubuntu/torched_products_addresses.csv"
    type => "ats_locations"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["latitude", "longitude"]
    separator => ","
  }
}

output {
  elasticsearch {
    index => "ats_member_locations"
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
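
For reference, the rows in the CSV look roughly like this (the values below are made up; the point is that the first two columns are latitude and longitude, matching the columns in the csv filter above):

40.7128,-74.0060
34.0522,-118.2437
41.8781,-87.6298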
Here is what happens when I try to load the CSV file.
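I start Logstash like this (csv.conf is just the local name I gave the config shown above):

cd /home/ubuntu/logstash-2.1.0
bin/logstash -f csv.conf

After startup, the only output I get is the following: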
Logstash startup completed
Flushing buffer at interval {:instance=>"#<LogStash::Outputs::ElasticSearch::Buffer:0x9b8ed66 @operations_mutex=#<Mutex:0x77f1221>, @max_size=500, @operations_lock=#<Java::JavaUtilConcurrentLocks::ReentrantLock:0x3408616>, @submit_proc=#<Proc:0x85f0e69@/home/ubuntu/logstash-2.1.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.1.4-java/lib/logstash/outputs/elasticsearch/common.rb:54>, @logger=#<Cabin::Channel:0x77cefb59 @metrics=#<Cabin::Metrics:0x7039b380 @metrics_lock=#<Mutex:0x266de755>, @metrics={}, @channel=#<Cabin::Channel:0x77cefb59 ...>>, @subscriber_lock=#<Mutex:0x56b189b7>, @level=:info, @subscribers={12450=>#<Cabin::Outputs::IO:0x6b3db8ec @io=#<IO:fd 1>, @lock=#<Mutex:0x2223bd91>>}, @data={}>, @last_flush=2016-04-04 11:16:26 -0400, @flush_interval=1, @stopping=#<Concurrent::AtomicBoolean:0x78856453>, @buffer=[], @flush_thread=#<Thread:0x37b92a4 run>>", :interval=>1, :level=>:info}
(the same "Flushing buffer at interval" line then repeats every second or so, with only the timestamp changing)
^CSIGINT received. Shutting down the pipeline. {:level=>:warn}
Pipeline shutdown complete. {:level=>:info}
Logstash shutdown completed
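
As far as I can tell, no events ever make it into Elasticsearch, and the rubydebug codec never prints a single event to stdout either. I checked the index with a count query along these lines (index name taken from the output section above):

curl -XGET 'http://localhost:9200/ats_member_locations/_count?pretty'

What am I missing? Why does Logstash just sit there flushing an empty buffer instead of reading the CSV?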