Right now I'm working with an instance of Ubuntu 14.04, but I've also tried the SIFT Workstation build of 14.04, Windows 10, and Ubuntu 15. No matter what, and despite days of tinkering, I still cannot get Logstash to load a CSV file into elasticsearch. Please help!
Here's my configuration file, logstash-l2t.conf, which is stored in /opt/logstash. I've changed ownership of /var/log to my username, chopomatic.
```
input {
  file {
    path => ["/home/FeedMeLog2Timeline/*.csv"]
    type => "timeline"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "log2timeline"
  }
  stdout { codec => rubydebug }
}
```
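For what it's worth, here's a variant of the input block I'm planning to try next, based on what I've read about the file input's sincedb tracking. I'm assuming `sincedb_path => "/dev/null"` really does force Logstash to re-read the files from the beginning on every run — that's my understanding from the docs, not something I've confirmed:

```
input {
  file {
    path => ["/home/FeedMeLog2Timeline/*.csv"]
    type => "timeline"
    start_position => "beginning"
    # Assumption: pointing the sincedb at /dev/null keeps Logstash from
    # remembering how far it already read these files, so the CSVs get
    # re-read from the top each time the pipeline starts.
    sincedb_path => "/dev/null"
  }
}
```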
I can hit elasticsearch without issue in my browser at localhost:9200 or 127.0.0.1:9200.
I can access Kibana without issue in my browser at localhost:5601. It's working: it shows the index I created (log2timeline), and it shows the one record in that index that I sent to elasticsearch manually (via Sense).
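In case it helps, these are the exact checks I run from the shell (both assume the stock port and my log2timeline index):

```shell
# Does elasticsearch answer on the default port?
curl -s http://127.0.0.1:9200

# Does the index exist, and how many documents does it hold?
# (I expect 1 here -- the record I added manually via Sense.)
curl -s 'http://127.0.0.1:9200/log2timeline/_count?pretty'
```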
Nothing helps. When I go to /opt/logstash and run `bin/logstash agent -f logstash-l2t.conf --configtest`, it returns "Configuration OK." But when I run the same command without `--configtest`, I get the output below and never anything more. It hangs at "Logstash startup completed." I've left it running as long as overnight with no change and no records added.
```
chopomatic@ubuntu:~$ cd /opt/logstash
chopomatic@ubuntu:/opt/logstash$ sudo bin/logstash agent -f logstash-l2t.conf --verbose
sudo: /var/lib/sudo owned by uid 1000, should be uid 0
[sudo] password for chopomatic:
Settings: Default filter workers: 2
Registering file input {:path=>["/home/FeedMeLog2Timeline/*.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/home/chopomatic/.sincedb_bd0159e016db5402f5fc95c9198868c2", :path=>["/home/FeedMeLog2Timeline/*.csv"], :level=>:info}
Worker threads expected: 2, worker threads started: 2 {:level=>:info}
Using mapping template from {:path=>nil, :level=>:info}
Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"float"}, "longitude"=>{"type"=>"float"}}}}}}}, :level=>:info}
New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["127.0.0.1:9200"], :level=>:info}
Pipeline started {:level=>:info}
Logstash startup completed
```
Any ideas? (I'll be out for a couple of hours, but I'll be checking this thread and will answer any responses the moment I return.)
Thanks!
Chop