All Running but not Parsing or sending to Elasticsearch

Hi All.

I've set up Logstash and Elasticsearch on a DigitalOcean Droplet. I want to index a CSV file in Elasticsearch by running it through Logstash. Just to get things running, I'm using GREEDYDATA patterns in my grok filter; the pattern has been tested and works. Here's where I'm at:

My Logstash conf.d file is as follows:

    input {
      file {
        start_position => "beginning"
        path => "/usr/airtime.csv"
      }
    }

    filter {
      grok {
        match => { "message" => "\"%{GREEDYDATA:Timestamp}\",%{GREEDYDATA:Network},%{GREEDYDATA:Amount},%{GREEDYDATA:MobilePhoneContactNumber},%{GREEDYDATA:VoucherPIN}$" }
        add_field => [ "received_at", "%{@timestamp}" ]
        add_field => [ "received_from", "%{host}" ]
      }
    }

    output {
      elasticsearch { hosts => ["localhost:9200"] }
      stdout { codec => "dots" }
      if "_grokparsefailure" in [tags] {
        stdout { codec => "rubydebug" }
      }
    }

My Elasticsearch is listening on localhost, port 9200. I run Logstash using the following command from within /bin:

./logstash -f /etc/logstash/conf.d/logstash-simple.conf

The run is attached as a picture to keep this post small. It runs completely, up to the point where it waits for input (the red line in the image). After waiting 10 minutes, I press Ctrl-C. No indexing or parsing is done and Elasticsearch remains empty. There are no errors in the Logstash or Elasticsearch logs.

I'm not sure what to do at this point. If you need any further info, please do not hesitate to ask. Any further help will be greatly appreciated.

Thank You!

Logstash has probably processed the file before, so it doesn't process it again. start_position => "beginning" only matters for previously unseen files. See what the file input documentation says about sincedb, and check out the countless previous posts on this topic.
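If you want the file re-read on every run while testing, one common approach (a sketch, not your exact config; the path is taken from your post) is to point the file input's sincedb_path at /dev/null so Logstash never remembers how far it has read:

    input {
      file {
        path => "/usr/airtime.csv"
        start_position => "beginning"
        # Don't persist the read position; the whole file is re-read on every restart.
        # Only suitable for testing -- in production this would re-index duplicates.
        sincedb_path => "/dev/null"
      }
    }

Alternatively, delete the existing sincedb file (kept under Logstash's data directory by default) before restarting, which makes the file look previously unseen again so start_position => "beginning" applies.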

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.