Unable to send data to elasticsearch

This question has probably been answered a million times, but I can't find a satisfactory answer, nor do I understand what I'm doing wrong.

Curling Elasticsearch, I get:

{
  "status" : 200,
  "name" : "poc",
  "cluster_name" : "poc",
  "version" : {
    "number" : "1.7.2",
    "build_hash" : "e43676b1385b8125d647f593f7202acbd816e8ec",
    "build_timestamp" : "2015-09-14T09:49:53Z",
    "build_snapshot" : false,
    "lucene_version" : "4.10.4"
  },
  "tagline" : "You Know, for Search"
}
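
(That's a plain GET against the HTTP root endpoint, along the lines of:

curl http://10.65.252.126:9200/

with the same host and port as in the Logstash config below.)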

So I know elasticsearch is running.

When I start up Logstash with the following config:

input {
    file {
        path => "/opt/logstash/csv/*.txt"
        start_position => "beginning"
    }
}

filter {
    csv {
        columns => ["@timestamp", "value1", "value2", "value3", "value4", "value5", "value6", "value7", "value8", "value9", "value10", "value11", "value12", "value13", "value14"]
        separator => ","
    }
}

output {
    elasticsearch {
        host => "10.65.252.126"
        port => "9200"
        protocol => "http"
        index => "logstash-%{+YYYY.MM.dd}"
    }
    stdout {
        codec => rubydebug
    }
}

I see the following line repeated, over and over:

_discover_file_glob: /opt/logstash/csv/*.txt: glob is: ["/opt/logstash/csv/1.txt", "/opt/logstash/csv/2.txt"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}

But using ESHQ, I never see any data inserted into Elasticsearch. I've used Logstash and Elasticsearch in this exact same context before (with a different config, though; it's been months and I don't remember how I got it working last time). I don't see any errors, the debug output doesn't report any problems parsing the CSV text files, and, of course, I see no data.

Is there anything I'm specifically doing wrong, other than having Logstash massively, mysteriously misconfigured? I wish configtest would throw an error or something, but it's pretty much silent. I could really use some help here.
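
For reference, by "configtest" I mean roughly this (the config path is just where mine happens to live):

/opt/logstash/bin/logstash agent --configtest -f /opt/logstash/logstash.conf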

Unless new data is being added to these files, I'd bet that Logstash is waiting for that to happen. start_position => beginning only applies to previously unseen files.
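
If the goal is just to re-ingest the same files while testing, a common workaround is to point sincedb at /dev/null so Logstash forgets its read position on every restart. A sketch, keeping your path and start_position:

input {
    file {
        path => "/opt/logstash/csv/*.txt"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}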

Deleting the .sincedb_p987345761304985720934875 file would make Logstash treat the files as new, wouldn't it?

Yes, deleting the sincedb file and restarting Logstash should do it.
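
Something along these lines, assuming Logstash runs as a service (otherwise just stop the process); by default the sincedb lands in the home directory of the user running Logstash:

# stop Logstash first so it doesn't rewrite the sincedb on shutdown
service logstash stop
rm ~/.sincedb_*    # the hash suffix varies per watched path
service logstash start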

/opt/logstash/csv/1.txt: stat failed (No such file or directory - /opt/logstash/csv/1.txt), deleting from @files {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"81", :method=>"each"}
:delete for /opt/logstash/csv/1.txt, deleted from @files {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"96", :method=>"subscribe"}
/opt/logstash/csv/2.txt: stat failed (No such file or directory - /opt/logstash/csv/2.txt), deleting from @files {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"81", :method=>"each"}
:delete for /opt/logstash/csv/2.txt, deleted from @files {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"96", :method=>"subscribe"}
_discover_file_glob: /opt/logstash/csv/*.txt: glob is: ["/opt/logstash/csv/3.txt", "/opt/logstash/csv/4.txt"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}
_discover_file: /opt/logstash/csv/*.txt: new: /opt/logstash/csv/3.txt (exclude is []) {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"141", :method=>"_discover_file"}
_discover_file: /opt/logstash/csv/*.txt: new: /opt/logstash/csv/4.txt (exclude is []) {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"141", :method=>"_discover_file"}
_open_file: /opt/logstash/csv/3.txt: opening {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"118", :method=>"_open_file"}
/opt/logstash/csv/3.txt: sincedb last value 0, cur size 6120118 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"145", :method=>"_open_file"}
/opt/logstash/csv/3.txt: sincedb: seeking to 0 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"147", :method=>"_open_file"}
writing sincedb (delta since last write = 15) {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"195", :method=>"_read_file"}
_open_file: /opt/logstash/csv/4.txt: opening {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"118", :method=>"_open_file"}
/opt/logstash/csv/4.txt: sincedb last value 0, cur size 6172095 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"145", :method=>"_open_file"}
/opt/logstash/csv/4.txt: sincedb: seeking to 0 {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"147", :method=>"_open_file"}
/opt/logstash/csv/3.txt: file grew, old size 0, new size 6120118 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"96", :method=>"each"}
/opt/logstash/csv/4.txt: file grew, old size 0, new size 6172095 {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"96", :method=>"each"}
_discover_file_glob: /opt/logstash/csv/*.txt: glob is: ["/opt/logstash/csv/3.txt", "/opt/logstash/csv/4.txt"] {:level=>:debug, :file=>"filewatch/watch.rb", :line=>"132", :method=>"_discover_file"}

While Logstash was running, I moved 1.txt and 2.txt to 3.txt and 4.txt respectively. Logstash saw that and appeared to do some kind of updating, but it certainly never communicated the changes to the Elasticsearch server. I can tell from ESHQ that Logstash talked to the Elasticsearch server in some fashion; I'm just not sure what it was doing.

Logstash is supposed to track renames properly, so it should still be tailing 3.txt and 4.txt without emitting any messages. What's the current file offset according to sincedb?
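
The sincedb normally lives in the home directory of the user running Logstash, so something like this should show one line per watched file:

cat ~/.sincedb_*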

527292 0 64769 0
527296 0 64769 0

Oh, this is weird then. It's still at offset zero (the last column of a sincedb entry is the byte offset into the file; the first is the inode). I assume those inode numbers match the actual files? I'd disable the elasticsearch output to make sure that's not what's clogging the pipes (though if it were the problem, it should be screaming about it in the logs).
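
To verify the inodes match:

ls -i /opt/logstash/csv/*.txt

And to take Elasticsearch out of the picture, temporarily run with only the stdout output, i.e. your output section minus the elasticsearch block:

output {
    stdout {
        codec => rubydebug
    }
}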

I put together a file with the same number of columns, with placeholder data, and Logstash immediately picked it up and transmitted it to Elasticsearch. So now I'm trying to understand what's wrong with the many thousands of lines of data in the existing files.
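
For reference, the placeholder file was something like this (a sketch; 15 comma-separated columns to match the csv filter):

for i in $(seq 1 100); do
    echo "2015-10-01T00:00:00Z,$i,2,3,4,5,6,7,8,9,10,11,12,13,14"
done > /opt/logstash/csv/test.txt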

Heh. I cut the file down from its original size (~6 MB) and it seems to work. Is there a file size limit for Logstash on Linux systems?

Is there a file size limit for Logstash on Linux systems?

No limit intended, and certainly not anywhere near 6 MB.
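
If you want to narrow down where it chokes, split(1) makes it easy to feed the original file back in smaller pieces (a sketch; filenames are placeholders, and the chunks are renamed to match your *.txt glob):

split -l 10000 big-original.txt /opt/logstash/csv/chunk_
for f in /opt/logstash/csv/chunk_*; do mv "$f" "$f.txt"; done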