Bad file descriptor error with a large dictionary file (>10MB) in Logstash

I have multiple log messages in a file which I am processing with Logstash filter plugins. The filtered logs are then sent to Elasticsearch.

There is a field called addID in each log message. I want to drop all the log messages that have a particular addID present. These particular addIDs are listed in an ID.txt file.

If the addID of a log message matches any of the addIDs present in the ID.txt file, that log message should be dropped. I am using a ruby filter to achieve this.
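For context, here is a minimal sketch of the kind of check I want to do (the field name IS_BAD_ID and the path /Users/jshaw/mapping/ID.txt are only examples, and I am assuming ID.txt has one addID per line; my actual config is further below):

filter {
    ruby {
        # read the ID file and flag the event if its addID appears in it
        code => '
            ids = File.readlines("/Users/jshaw/mapping/ID.txt").map(&:strip)
            event["IS_BAD_ID"] = "true" if ids.include?(event["addID"])
        '
    }

    if [IS_BAD_ID] == "true" {
        drop { }
    }
}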

Scenario: The issue is that if the dictionary file I am using is several MB in size, Logstash hangs, and I get a Bad file descriptor error when I manually stop the pipeline. However, if I use a file that is only a few KB, everything works fine.

I have tried changing LS_HEAP_SIZE to 4g, but nothing worked for me.

Could anyone help me achieve this?

@magnusbaeck @warkolm
Please help me, sir.

Below is my config file.

input {

    file {
        path => "/Users/jshaw/logs/access_logs.logs"
        ignore_older => 0
    }
}

filter {

    grok {
        patterns_dir => ["/Users/jshaw/patterns"]
        match => ["message", "%{TIMESTAMP:Timestamp}+%{IP:ClientIP}+%{URI:Uri}"]
    }


    kv {
        field_split => "&?"
        include_keys => [ "addID" ]
        allow_duplicate_values => "false"
        add_field => { "IS_BAD_IP" => "false" }
    }

    if [ClientIP] {
        ruby {
            code => 'if File.open("/Users/jsaw/mapping/badIP.txt").lines.any? { |line| line.include?(event["ClientIP"]) }
                event["IS_BAD_IP"] = "true"
            end'
        }

        if "true" in [IS_BAD_IP] {
            drop { }
        }
    }
}

output {

    elasticsearch {
        hosts => ["localhost:9200"]
    }
}

Drop filter not working?

@jpcarey

I have also tried the translate filter, but I could not achieve my goal with a large dictionary. Do you know of any workaround?
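For reference, my translate-based attempt looked roughly like this (the dictionary path, the IS_BAD_ID field name, and the assumption that the dictionary is a YAML file mapping each addID to "true" are just placeholders):

    translate {
        field => "addID"
        destination => "IS_BAD_ID"
        dictionary_path => "/Users/jshaw/mapping/ID.yml"
        fallback => "false"
    }

    if [IS_BAD_ID] == "true" {
        drop { }
    }

With a large dictionary file I run into the same problem as with the ruby filter.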