Cron job for CSV filter

Hi All,

I am using Logstash to upload data into Elasticsearch, and we have been using the csv filter to parse the data. I need to re-read this CSV file every 15 minutes.

The csv filter does not accept a "schedule" option, so how do I set up a cron-like schedule to run the csv filter plugin every 15 minutes?
I tried stat_interval => 60; this works for the first update, but the data is not updated on a regular basis after that.

Please advise: is there an option to run the csv filter on a schedule?

Thanks
Gautham

You cannot schedule a filter. You may be able to schedule an input. How are you reading the file?

@Badger Here is my config file:

input {
  file {
    path => "/etc/logstash/http_poller/python/cmdb.csv"
    start_position => "beginning"
   sincedb_path => "/dev/null"
   stat_interval => 60
  }
}
filter {
  csv {
      separator => ","
      columns => ["support_group","affectedApplication","[result][inc_business_service][display_value]","it_application_owner","NodeID","[data][results][Caption]","Business_Criticality","Type"]
  }
}
output {
  elasticsearch {
    hosts => ["1.3.5.2:9200"]
    user => "****"
    password => "*******"
    index => "performance"
  }
  # stdout { codec => rubydebug }
}

Thanks
Gautham

A file input does not support a schedule. If you want to read the file every 15 minutes, you might be better off using an exec input to cat the file.
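
For reference, a minimal sketch of that exec approach, keeping the file path from the config posted above; the cron expression and the line codec are assumptions rather than something confirmed in this thread (without a line codec the exec input emits the whole command output as a single event). The filter and output sections would stay the same as posted:

input {
  exec {
    # Assumed: re-read the whole file every 15 minutes (cron syntax)
    command => "cat /etc/logstash/http_poller/python/cmdb.csv"
    schedule => "*/15 * * * *"
    # Assumed: split the command output into one event per CSV row
    codec => line
  }
}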

