Old files: ignored, or indexed again with new IDs?

I have a folder containing multiple files with a date in the name, in the format "foo_2019_11_23". I have already created indexes from those files with the following command:

bin/logstash -f /etc/logstash/conf.d/foo.conf

Now new files have been added to that folder, and I want to index those too by running the same command again. Will Logstash create new indexes with new IDs for the old files in that folder, or will the old files be ignored and only the new files added?

This is my conf file:

input {
    file {
        path => "/opt/processed/foo*"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

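For contrast, a minimal sketch of a file input that does remember its position between runs (the sincedb path here is an assumption; any location writable by the Logstash user works):

input {
    file {
        path => "/opt/processed/foo*"
        start_position => "beginning"
        # persist read offsets so files already processed are skipped on the next run
        sincedb_path => "/var/lib/logstash/sincedb_foo"
    }
}

With a persistent sincedb, rerunning the same command picks up only files (or file portions) not yet read.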
You have told logstash, using 'sincedb_path => "/dev/null"', not to persist knowledge of what it has read and what it has not read, so it will re-read everything.


But will it create new IDs for the old files?

If you are writing to Elasticsearch, then unless you are setting document_id based on the contents of the files, they will get new, unique IDs.
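A common way to set document_id from the content is the fingerprint filter. This is a sketch, not from the thread; the index name and hosts are placeholders:

filter {
    fingerprint {
        source => ["message"]
        method => "SHA256"
        target => "[@metadata][fingerprint]"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "foo"
        # same content => same _id, so a re-read overwrites the existing
        # document instead of creating a duplicate with a new id
        document_id => "%{[@metadata][fingerprint]}"
    }
}

Using @metadata keeps the fingerprint out of the indexed document itself.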


So, my old files were indexed with 'sincedb_path => "/dev/null"', and there is no way to avoid Logstash creating new IDs for my old files?

If that is true, does it mean I will have duplicates: same data, different IDs?

That is what I would expect.
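The fingerprint approach above avoids that outcome because hashing is deterministic. This standalone Python sketch (function and sample line are illustrative, not from the thread) shows why content-based IDs turn re-reads into overwrites rather than duplicates:

```python
import hashlib

def doc_id(line: str) -> str:
    """Derive a document id from the event content, as a fingerprint filter would."""
    return hashlib.sha256(line.encode("utf-8")).hexdigest()

first_run = doc_id("2019-11-23,foo,42")
second_run = doc_id("2019-11-23,foo,42")  # same line re-read on a later run
print(first_run == second_run)  # → True: the re-read targets the same _id
```

Because the same bytes always map to the same _id, re-ingesting an old file merely re-indexes each document over itself.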
