Logstash file input: read a file only once

I'm somewhat confused, probably about how to use the file input for a CSV file. I have a CSV file that is copied to a directory once every day, and I want it to be read only once per day.

So here's my problem. I restarted Logstash yesterday and the records got indexed. Today, when I copied the file over and overwrote the existing one, the records were appended to yesterday's index instead of going into a new index.

Should I be using close_older? But then its default value of 1 hour should already have closed the file.

What should I do to ensure the file is read only once every day and a new index is created every day? I could set stat_interval to 86400 so the file is only stat'ed once every 24 hours (sketched below), but why is the new index not being created?
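
For reference, here's how I understand those two options would be set on the file input (a sketch only, not my actual conf; both values should be in seconds, if I'm reading the plugin docs right):

input {
  file {
    path => "/infra/elk/unixtoelk_inv/inv_aix.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    close_older => 3600      # close the file handle after 1 hour of inactivity (the documented default)
    stat_interval => 86400   # only stat the file for changes once every 24 hours
  }
}

And here is my actual conf: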

input {
  file {
    path => "cat /infra/elk/unixtoelk_inv/inv_aix.csv"
    start_position => "beginning"
    type => "invunix"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == "invunix" {
    csv {
      separator => ","
      columns => ["Date","Location","HostName","IP_Address","OS","OS_Version"]
    }
    date {
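      # this overwrites @timestamp with the value parsed from the Date column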
      match => ["Date", "ddMMYYYYHHmmss"]
    }
    mutate {
      remove_field => ["Date", "path", "host", "message"]
    }
  }
}

output {
  if [type] == "invunix" {
    elasticsearch {
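      # %{+dd-MM-YYYY} is expanded from each event's @timestamp, not from the time the file was read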
      index => "inv-unix-%{+dd-MM-YYYY}"
      hosts => [ "10.15.13.84:9200" ]     
    }
  }
#  stdout { codec => rubydebug }
}
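
On a related note, if the file input plugin is recent enough (4.1.0+, I believe), there seems to be a read mode intended for exactly this consume-the-whole-file case. Something like the following — untested on my side, and the completed-log path is just a placeholder:

input {
  file {
    path => "/infra/elk/unixtoelk_inv/inv_aix.csv"
    mode => "read"                        # read the file to EOF and treat it as done
    file_completed_action => "log"        # log (rather than delete) the file when finished
    file_completed_log_path => "/infra/elk/unixtoelk_inv/completed.log"  # placeholder path
    sincedb_path => "/dev/null"           # don't persist state, so tomorrow's overwrite is reread
    type => "invunix"
  }
}

Would that be the right direction?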
