Hello,
I use Logstash to read a directory where log files from other applications are dumped (path: /data/FLD/files/.../*.log).
Then I filter these files with Logstash to keep only the ones that interest me, and I put those files into a new directory.
But I have a problem: the log files that arrive in my watched directory are not written all in one piece. For example, application A writes its part into log 1, then application B writes into log 1, then application C, and so on, and these operations take some time.
For my needs, I would like the Logstash file input to open a file only some time after its creation date, for example 10 minutes, so that I can be sure every application has finished writing to the file before Logstash reads it.
Is it possible to add such a delay?
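To make the idea concrete, the behaviour I'm imagining would look something like the sketch below. The read_delay setting is hypothetical; as far as I know, no such option exists in the file input:

input {
  file {
    path => "/data/FLD/files/.../*.log"
    # hypothetical option, for illustration only: do not start
    # reading a file until 10 minutes after its creation
    # read_delay => "10 minutes"
  }
}

If nothing like this exists, one workaround I'm considering is to let the applications write into a staging directory and move each file into the watched directory only once it has been idle for 10 minutes.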
Thanks to all.
This is my pipeline:
input {
  file {
    path => "/data/FLD/files/.../*.log"
    type => "GDAlogATrier"
    codec => multiline {
      pattern => "anepastrouver"
      negate => true
      what => "previous"
      # auto_flush_interval => 1 (I don't know what this setting does)
      max_lines => 50000
      charset => "Windows-1252"
    }
    exclude => [ "efluid_*.log", "exploit*.log" ]
    #ignore_older => 120
    close_older => 60
  }
}

filter {
  # drop events whose return code is 0
  if [message] =~ "CODE RETOUR : 0" {
    drop { }
  }
  # keep "CODE RETOUR : -4" events only if they also mention MNOR, TECH or FTAL
  if [message] =~ "CODE RETOUR : -4" {
    if [message] !~ "MNOR" and [message] !~ "TECH" and [message] !~ "FTAL" {
      drop { }
    }
  }
  # hash each message so the output file name can be derived from its content
  fingerprint {
    source => ["message"]
    target => "fingerprint_id"
    method => "MURMUR3"
  }
}

output {
  file {
    codec => line {
      format => "%{[message]}"
    }
    path => "/home/D_NT_UEM/petotadm/retour/trie-%{[fingerprint_id]}.log"
  }
  stdout { codec => rubydebug }
}