INPUT read file after x seconds

Hello,

I use Logstash to read a directory where log files from other applications are dumped.
(PATH : /data/FLD/files/.../*.log)

Then I filter these files with Logstash to keep only the entries that interest me,
and put the interesting files into a new directory.

But I've got a problem.

The log files that arrive in my PATH directory do not arrive all in one piece.
For example, application A writes its part to log 1, then application B writes to log 1, then application C, and so on.

These operations take some time.

For my needs, I would like the Logstash file input to open these files only, say, 10 minutes after the file's creation date, so I can be sure that all the applications have finished writing to the file before Logstash processes it.

Is it possible to add such a delay?

Thanks to all.

This is my pipeline :

input
{
  file
  {
    path => "/data/FLD/files/.../*.log"
    type => "GDAlogATrier"
    codec => multiline {
      pattern => "anepastrouver"
      negate => true
      what => "previous"
     #  auto_flush_interval => 1 (I don't know what this does..)
      max_lines => 50000
      charset => "Windows-1252"
    }
    exclude => [ "efluid_*.log", "exploit*.log" ]
    #ignore_older => 120
    close_older => 60
  }
}

filter
{
 if [message] =~ "CODE RETOUR : 0"
 {
   drop { }
 }
 if [message] =~ "CODE RETOUR : -4"
 {
   if [message] !~ "MNOR" and [message] !~ "TECH" and [message] !~ "FTAL"
   {
     drop { }
   }
 }
 fingerprint
 {
    source => ["message"]
    target => "fingerprint_id"
    method => "MURMUR3"
 }
}

output
{
  file
  {
    codec => line
    {
      format => "%{[message]}"
    }
    path => "/home/D_NT_UEM/petotadm/retour/trie-%{[fingerprint_id]}.log"
  }
  stdout { codec => rubydebug }
}

Logstash has nothing built-in for this.
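A common workaround is to have the applications write into a staging directory, and periodically (e.g. from cron) move files that have not been modified for 10 minutes into the directory Logstash watches. This is only a sketch under assumptions: the function name, the staging directory, and the age threshold are all hypothetical, and it relies on GNU `find`'s `-mmin` test.

```shell
# stage_logs: move *.log files that have not been modified for more than
# $3 minutes (default 10) from the staging directory $1 into $2, the
# directory the Logstash file input watches. Hypothetical helper, not
# part of Logstash itself.
stage_logs() {
  incoming="$1"
  watched="$2"
  age="${3:-10}"
  # -mmin "+$age" matches files whose last modification is more than
  # $age minutes ago, i.e. files the applications have likely finished.
  find "$incoming" -name '*.log' -mmin "+$age" -exec mv {} "$watched" \;
}
```

Run from cron every few minutes, e.g. `stage_logs /data/FLD/staging /data/FLD/files`, so Logstash only ever sees files that have been quiet for the chosen interval.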

For my needs, I would like the Logstash file input to open these files only, say, 10 minutes after the file's creation date, so I can be sure that all the applications have finished writing to the file before Logstash processes it.

Why does this matter?

Yes, it is important, because currently when the Logstash file input plugin reads my files, the files are not completely written yet.

That's why Logstash closes the file and doesn't reopen it.

Logstash's file input monitors files continuously and picks up all changes to them. It doesn't matter if the files are complete when Logstash starts processing them. Logstash has no notion of "complete" when it comes to input files.
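In particular, `close_older` only releases the file handle after that many seconds without new data; if the file grows again later, the input reopens it and continues from the position recorded in the sincedb. A sketch of the relevant options (the values here are examples, not recommendations):

```
input {
  file {
    path => "/data/FLD/files/.../*.log"
    # Seconds of inactivity before the open file handle is released.
    # The file is reopened automatically if it receives new data later.
    close_older => 3600
    # Skips files last modified more than this many seconds ago; leave
    # it commented out while applications are still appending to files.
    # ignore_older => 86400
  }
}
```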

That's why Logstash closes the file and doesn't reopen it.

What makes you think Logstash is closing the files without reopening them?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.