Logstash not indexing new files

Hi, I have a directory that periodically receives new files from another machine. The files start with cpu and the date, e.g. cpu_2019_11_03, and are updated daily. Yesterday I created the index in Elasticsearch via Logstash, but the new files received today weren't indexed in Elasticsearch. Can anyone tell me why this happens?

This is my .conf file:

input {
  file {
    path => "/logs/cpu*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}

filter {
  csv {
    columns => [ "horayfecha", "estado", "equipo", "porcentajeCPU" ]
    separator => " "
    convert => { "porcentajeCPU" => "float" }
  }
  mutate {
    add_field => { "origen" => "alpha" }
    convert => {
      "porcentajeCPU" => "integer"
    }
  }
  date {
    match => [ "horayfecha", "HH:mm:ss MM/dd/YYYY" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["foo:9200"]
    index => "cpu-%{+YYYY.MM.dd}"
  }
  stdout {}
}

On a file input, ignore_older => 0 tells Logstash to ignore any file more than zero seconds old, which is every file. Remove that option.
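With that option removed, the file input from the original config would look like this (all other settings kept as posted):

```
input {
  file {
    path => "/logs/cpu*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```

Note that sincedb_path => "/dev/null" means Logstash never persists how far it has read, so every file matching the pattern is re-read from the beginning on each restart, which can index duplicate documents.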

Hi, thanks for your answer, but it still doesn't work; it only indexes the logs of the current day.

The index name is defined by the value of the @timestamp field, which by default is set to the time the event is read from the file. If you expect the index name to be based on, say, a timestamp in the data itself, you need to extract that timestamp into a separate field and use a date filter to copy it into @timestamp.
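As a sketch using the field names from the original config: the date filter is what maps the file's own timestamp into @timestamp, and the elasticsearch output then derives the daily index name from that field rather than from the ingest time.

```
filter {
  date {
    # horayfecha is the timestamp column produced by the csv filter above
    match => [ "horayfecha", "HH:mm:ss MM/dd/YYYY" ]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["foo:9200"]
    # %{+YYYY.MM.dd} is expanded from @timestamp, so yesterday's rows
    # land in yesterday's index instead of today's
    index => "cpu-%{+YYYY.MM.dd}"
  }
}
```

If the pattern does not match the actual contents of horayfecha, the date filter tags the event with _dateparsefailure and @timestamp falls back to the read time, which would send every event to the current day's index; checking the stdout output for that tag is a quick way to verify the parse.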