This question has probably been answered a million times, but I can't find a satisfactory answer, nor do I understand what I'm doing wrong.
Curling Elasticsearch's root endpoint (the same host and port as in my output config below):
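curl -XGET 'http://10.65.252.126:9200/'

returns: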
{
  "status" : 200,
  "name" : "poc",
  "cluster_name" : "poc",
  "version" : {
    "number" : "1.7.2",
    "build_hash" : "e43676b1385b8125d647f593f7202acbd816e8ec",
    "build_timestamp" : "2015-09-14T09:49:53Z",
    "build_snapshot" : false,
    "lucene_version" : "4.10.4"
  },
  "tagline" : "You Know, for Search"
}
So I know Elasticsearch is up and running.
When I start up Logstash with the following config:
input {
  file {
    path => "/opt/logstash/csv/*.txt"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["@timestamp", "value1", "value2", "value3", "value4", "value5", "value6", "value7", "value8", "value9", "value10", "value11", "value12", "value13", "value14"]
    separator => ","
  }
}

output {
  elasticsearch {
    host => "10.65.252.126"
    port => "9200"
    protocol => "http"
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
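For completeness, I start Logstash with debug logging enabled, along these lines (the config file path is just where I happen to keep it):

bin/logstash -f /etc/logstash/conf.d/csv.conf --debug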
I see the following line repeated, over and over:
_discover_file_glob: /opt/logstash/csv/*.txt: glob is: ["/opt/logstash/csv/1.txt", "/opt/logstash/csv/2.txt"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}
But using ESHQ, I never see any data show up in Elasticsearch. I've used Logstash and Elasticsearch in this exact context before (with a different config, though; it's been months and I don't remember how I got it working last time). I don't see any errors, the debug output doesn't show any problems parsing the CSV text files, and of course, no data appears.
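For anyone reproducing this without ESHQ, the equivalent check straight against Elasticsearch would be something like:

curl 'http://10.65.252.126:9200/_cat/indices?v'

where I'd expect a logstash-YYYY.MM.dd index to show up once documents are indexed.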
Is there anything I'm specifically doing wrong, other than having Logstash massively, mysteriously misconfigured? I wish configtest would throw an error or something, but it's pretty much silent.
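For reference, the configtest invocation I mean (same assumed config path as above) is:

bin/logstash -f /etc/logstash/conf.d/csv.conf --configtest

and it exits without complaint. I could really use some help here.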