Delete older documents

Is it possible to delete existing older documents from Elasticsearch on a schedule using Logstash?
My current config file is as follows:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.38-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://192.168.10.6/asterisk?user=cron&password=1234"
    jdbc_user => "Croos"
    parameters => {
    
    }
    schedule => "* * * * *"
    clean_run => true
    statement => "SELECT * from vicidial_log WHERE start_epoch > :sql_last_value"
    use_column_value => true
    tracking_column => "start_epoch"
  }
}

output {
  stdout { 
    codec => rubydebug 
  }

  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

The recommended approach is to use time-series indexes, i.e. indexes with a date and/or time in their names. You're already doing that: with the default elasticsearch output settings your data ends up in logstash-YYYY.MM.DD indexes. Then use Curator to, for example, delete indexes older than a certain number of days.
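For example, with Curator 4+ an action file along these lines deletes date-named indexes once they pass a cutoff. This is a sketch: the logstash- prefix matches Logstash's default index naming, but the 30-day cutoff and the file names/paths below are placeholders to adjust.

# delete_old_indices.yml -- Curator action file (assumed values, adjust to taste)
actions:
  1:
    action: delete_indices
    description: "Delete logstash- indexes older than 30 days, based on the date in the index name"
    options:
      ignore_empty_list: True
    filters:
    - filtertype: pattern
      kind: prefix
      value: logstash-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 30

Curator reads the cluster connection from its own client config file, so a daily cron entry is enough to keep the cleanup on a schedule:

# curator.yml -- client config (host/port assumed to match the elasticsearch output above)
client:
  hosts:
    - localhost
  port: 9200
logging:
  loglevel: INFO

# crontab: run the delete action every day at 01:00
0 1 * * * curator --config /etc/curator/curator.yml /etc/curator/delete_old_indices.yml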