File - failed to open fichier1.log: Unhandled IOException: java.io.IOException: unhandled errno: Too many open files

Hey,

I use Logstash to analyse files and drop the ones that don't interest me.

I currently listen on: /data/mypath/*.log
This path contains nearly 15,000 files, and I need to analyse all of them.

That's why, when I launch my pipeline, I get this error:

[WARN ] 2018-06-21 12:23:06.485 [[main]<file] file - failed to open /home/D_NT_UEM/petotadm/batch/WKF999-20180620-20180620-1951-29-722-721-51-2-188241890-20180620-195208.log: Unhandled IOException: java.io.IOException: unhandled errno: Too many open files

and the same message for every file in the directory. Is there a limit I can raise so that all these files can be opened?
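
From what I can see in the file input plugin docs, there is a max_open_files option (the number of file handles the input consumes at once, default apparently 4095) and a close_older option that releases handles for files that stop producing lines. Is something like the sketch below the right direction, or does the OS descriptor limit (ulimit -n) for the Logstash user need raising instead? The values here are just guesses, not tested:

input
{
  file
  {
    path => "/data/mypath/*.log"
    type => "GDAlogATrier"
    # guess: cap how many file handles the input keeps open at any one time
    max_open_files => 4095
    # guess: close files with no new lines recently, freeing handles for the rest
    # (older plugin versions take a number of seconds instead of a duration string)
    close_older => "1 hour"
  }
}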

My pipeline:

input
{
  file
  {
    path => "/data/mypath/*.log"
    type => "GDAlogATrier"
    codec => multiline {
      pattern => "anepastrouver"
      negate => true
      what => "previous"
      auto_flush_interval => 1
      max_lines => 45000
    }
  }
}

filter
{
 if [message] =~ "CODE RETOUR : 0"
 {
   drop { }
 }
 if [message] =~ "CODE RETOUR : -4"
 {
   if [message] !~ "MNOR" and [message] !~ "TECH" and [message] !~ "FTAL"
   {
     drop { }
   }
 }
 fingerprint
 {
    source => ["message"]
    target => "fingerprint_id"
    method => "MURMUR3"
 }
}

output
{
  file
  {
    codec => line
    {
      format => "%{[message]}"
    }
    path => "/data/FLD/ELK/workload/trie-%{[fingerprint_id]}.log"
  }
  stdout { codec => rubydebug }
}

With this pipeline I filter the directory and keep only the files of interest in the output.

Thanks in advance for any answer.
