Appending CSV contents to an existing Logstash index

I have a folder where a user will keep adding CSV files for Logstash to pick up, one per day. The files are named like yyyymmdd-dailyUsageData.csv, i.e. the file name starts with the date. Each file's contents also include this Date value along with other columns, i.e. Date, colA, colB, colC, and so on. The requirement is to load the contents of these CSV files into Logstash/Elasticsearch on a daily basis, so as to create visualisations in Kibana.

Though I am able to do so for a single CSV file, I am not sure how to append the contents of the next day's file, which has a different date in its name, to the index that has already been created. Below is a sample of the conf file that I created:

    input {
      file {
        path => "/etc/logstash/20160302-dailyUsageData.csv"
        type => "usageData"
        start_position => "beginning"
        sincedb_path => "/home/ec2-user/mysincedbfile"
      }
    }
    filter {
      csv {
        separator => ","
        columns => ["Date", "colA", "colB"]
      }
      mutate { convert => ["colA", "integer"] }
      mutate { convert => ["colB", "float"] }
    }
    output {
      elasticsearch {
        action => "index"
        hosts => "localhost:9200"
        index => "dailyUsage"
        workers => 1
      }
      stdout {}
    }

Why do you want it in a single index? It makes more sense to use time-based indices, really.
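For example, here is a minimal sketch of what that could look like. The date format yyyy-MM-dd is an assumption, so adjust it to whatever format the Date column actually uses, and the index name is lowercased here to follow Elasticsearch index naming conventions:

    filter {
      date {
        # Assumed format of the CSV's Date column; change to match the real data
        match => ["Date", "yyyy-MM-dd"]
      }
    }
    output {
      elasticsearch {
        hosts => "localhost:9200"
        # One index per day, e.g. dailyusage-2016.03.02, derived from the event's @timestamp
        index => "dailyusage-%{+YYYY.MM.dd}"
      }
    }

With the Date column mapped onto @timestamp by the date filter, each day's rows land in that day's index automatically, regardless of when the file is actually read.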

However, all you need to do is set your path to a wildcard.
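For instance, a sketch of the file input using a glob, keeping the rest of your original input block unchanged:

    input {
      file {
        # The glob matches every CSV dropped into the folder, so tomorrow's file is picked up automatically
        path => "/etc/logstash/*.csv"
        type => "usageData"
        start_position => "beginning"
        # sincedb tracks how far each file has been read, so already-ingested files are not re-processed
        sincedb_path => "/home/ec2-user/mysincedbfile"
      }
    }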


Yes, I changed it to *.csv and it worked. Thanks!