Logstash reads the entire file again whenever the pipeline reloads

Hi all,

What I want is to read all the files in a directory, tail them for new changes, and discover new files.
By the way, the sincedb file I specified is always 0 bytes; nothing is ever written to it.

My config:

input {
  file {
    path => "/home/datareceive/elkGetLock/export*"
    start_position => "beginning"
    sincedb_path => "/home/esdteam/moodleData/getlock.sincedb"
    close_older => "15 s"
  }
}

Thank you.

Couldn't it be an access permission issue with this file?
Have you tried to read/write it as the logstash user?
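
A quick way to check, assuming the Logstash service runs as a user named logstash (adjust the user name and paths to your setup; this is just an illustration):

sudo -u logstash head -n 1 /home/datareceive/elkGetLock/export*   # can it read the input files?
sudo -u logstash touch /home/esdteam/moodleData/getlock.sincedb   # can it create/write the sincedb?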

Remove
start_position => "beginning"

I have run setfacl for the running user, so it should not be a permission issue.

Without start_position => "beginning", I am not able to read the old files already in the directory.

Could anyone help, please?

How are the files updated? Is there anything in the logs?

There is an scp job copying files to the directory whenever there are updates.
Which log do you mean? Logstash?
What log message are you looking for?

Logstash tracks files by inode. If you are copying over a file that has increased in size, it will appear as a new file even if it has the same name, because it gets assigned a new inode. This is why the file is reread repeatedly.
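
If you want to verify this, you could compare inodes before and after the copy job runs; for example (path taken from your config, the command is just an illustration):

ls -li /home/datareceive/elkGetLock/export*

If the inode number in the first column changes after the scp job runs, Logstash treats the file as brand new and reads it from the start.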

I take your point; I have handled this using document_id.
So does sincedb only work when tailing a particular file? The sincedb file is always 0 bytes, as far as I can observe.
Or how does it actually work?

The file input expects data to be appended to local files that keep their inode. The sincedb should be populated, but I wonder if your close_older setting affects this. Remove it and see what happens.
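
For example, a stripped-down version of your input to test with (same paths as your original config, only close_older removed; a sketch to try, not a guaranteed fix):

input {
  file {
    path => "/home/datareceive/elkGetLock/export*"
    start_position => "beginning"
    sincedb_path => "/home/esdteam/moodleData/getlock.sincedb"
  }
}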

My ES cluster is currently down due to a disk failure; I am not sure whether it is related to the ELK stack itself and am still investigating, so I cannot test what you said for now.
If possible, please advise whether there have been any cases from other users where the ELK application caused a disk failure in the past. My ELK stack is version 6.5.4.

Back to the question: it doesn't make sense that this is related to close_older, because if I do not close the files, Logstash would eventually hit the open-file limit as the number of files keeps increasing. My intention is to reopen a file whenever there is new data or an updated modification time, and to start from the new lines instead of the beginning of the file, based on the sincedb.

As far as I know, Logstash does not work that way, so getting it to behave as you describe might not be possible.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.