Logstash pod failing to create pipeline due to error loading a ruby script

I am seeing the error below in the Logstash log when deploying an eck-logstash pod using Helm. The Ruby script is added via a ConfigMap, and I have checked that the script is mounted in the correct directory with its permissions set to rwxrwxrwx.

I am unable to work out what is causing the error to be thrown.

[2025-10-15T21:19:54,229][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:imt-pipeline, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ScriptError) Error during load of '/usr/share/logstash/scripts/delete-imt-entities.rb': #<RuntimeError: Script does not define a filter! Please ensure that you have defined a filter method!>", :backtrace=>["org.logstash.config.ir.CompiledPipeline.(CompiledPipeline.java:137)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:236)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1379)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:363)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:950)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:548)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:363)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:324)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:118)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:144)", "org.jruby.RubyProc.call(RubyProc.java:354)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:111)", "java.base/java.lang.Thread.run(Thread.java:1583)"], :cause=>{:exception=>Java::OrgJrubyExceptions::StandardError, :message=>"(ScriptError) Error during load of '/usr/share/logstash/scripts/delete-imt-entities.rb': #<RuntimeError: Script does not define a filter! 
Please ensure that you have defined a filter method!>", :backtrace=>["RUBY.load(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby/script.rb:20)", "RUBY.initialize(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:55)", "org.logstash.plugins.factory.ContextualizerExt.initialize(org/logstash/plugins/factory/ContextualizerExt.java:97)", "org.jruby.RubyClass.new(org/jruby/RubyClass.java:950)", "org.logstash.plugins.factory.ContextualizerExt.initialize_plugin(org/logstash/plugins/factory/ContextualizerExt.java:80)", "org.logstash.plugins.factory.ContextualizerExt.initialize_plugin(org/logstash/plugins/factory/ContextualizerExt.java:53)", "org.logstash.plugins.factory.PluginFactoryExt.filter_delegator(org/logstash/plugins/factory/PluginFactoryExt.java:73)", "org.logstash.plugins.factory.PluginFactoryExt.plugin(org/logstash/plugins/factory/PluginFactoryExt.java:250)", "org.logstash.execution.AbstractPipelineExt.initialize(org/logstash/execution/AbstractPipelineExt.java:236)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47)", "org.jruby.RubyClass.new(org/jruby/RubyClass.java:950)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:50)", "RUBY.converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:431)"]}}

Hello,

Can you share the pipeline configuration and the Ruby script? It is pretty hard to provide any insight without seeing both the pipeline configuration and the script.

Below are copies of the pipeline configuration and the Ruby script that is failing to load.

Pipeline as created in the pod

    bash-5.1$ cat pipelines.yml
    - config.string: |
        input {
          http {
            port => 5047
            tags => imtdata
          }
    
          http {
            port => 5048
            tags => imtgroups
          }
        }
    
        filter {
          ruby {
            code => '
              str = event.get("message")
              new_event = "imt_xml"
              str1 = CGI.unescape(str)
              substr1 = "<crawl-urls"
              substr2 = "</crawl-urls>"
              substr3 = "crawl-delete"
              start_index = str1.index(substr1)
              xml_str = str1[start_index..-1]
              end_index = xml_str.index(substr2) + 13
              xml_str.slice!(end_index..)
              is_delete = xml_str.index(substr3)
              if is_delete
                event.tag("deletedata")
              end
              event.set(new_event, xml_str)
            '
          }
        }
    
        filter {
          ruby {
            path => "/usr/share/logstash/scripts/delete-imt-entities.rb"
          }
          mutate { remove_tag => ["imtdata", "imtgroups"] }
          if [delete][index] {
            mutate { add_tag => [ "deletedata"]}
          }
          else {
            drop {}
          }
        }
    
        filter {
          if "imtdata" in [tags] {
            ruby {
              path => "/usr/share/logstash/scripts/extract-imt-entities.rb"
            }
            if [meta] {
              split { field => "meta" }
              ruby {
                path => "/usr/share/logstash/scripts/get-imt-attachment.rb"
              }
              mutate { add_field => { "[@metadata][target_index]" => "imt_document_%{+YYYY}" }}
              mutate { add_tag => [ "imtdata" ]}
              if "_split_type_failure" in [tags] { drop{}}
            }
          }
        }
    
        filter {
          date {
            match => ["[meta][documentDate]", "ISO8601"]
            target =>["@timestamp"]
          }
        }
    
        filter {
          if "imtgroups" in [tags] {
            xml { source => "imt_xml" target => "[@metadata][theXML]" force_array => false remove_field => [ "imt_xml" ] }
            split { field => "[@metadata][theXML][crawl-url]" }
            ruby {
              path => "/usr/share/logstash/scripts/extract-imt-groups.rb"
            }
            mutate { add_tag => [ "imtgroups" ]}
          }
        }
    
        output
          {
            stdout {
              codec => rubydebug
            }
    
            if "imtdata" in [tags] {
              elasticsearch {
                data_stream => false
                hosts => ["https://10.104.118.164:9200"]
                ssl_certificate_authorities => "/usr/share/logstash/config/ca.crt"
                index => "%{[@metadata][target_index]}"
                doc_as_upsert => true
                document_id => "%{[meta][uuid]}"
                user => "elastic"
                password => "xxx"
              }
            }
    
            if "imtgroups" in [tags] {
              elasticsearch {
                data_stream => false
                hosts => ["https://10.104.118.164:9200"]
                ssl_certificate_authorities =>  "/usr/share/logstash/config/ca.crt"
                index => "imt_security_groups"
                doc_as_upsert => true
                document_id => "%{[imt_groups][0][GroupName]}"
                user => "elastic"
                password => "xxx"
              }
            }
    
            if "deletedata" in [tags] {
              elasticsearch {
                data_stream => false
                hosts => ["https://10.104.118.164:9200"]
                ssl_certificate_authorities => "/usr/share/logstash/config/ca.crt"
                index => "%{[delete][index]}"
                action => delete
                document_id => "%{[delete][vse-key]}"
                user => "elastic"
                password => "xxx"
              }
            }
          }
      pipeline.id: imt
    bash-5.1$
    
    

Copy of the ruby script:

require 'nokogiri'
require 'elasticsearch'

def filter(event)
  new_record_event = LogStash::Event.new
  xml_str = event.get('imt_xml')
  client = Elasticsearch::Client.new( hosts: [{ host: 'https://10.104.118.164', port: '9200', user: 'elastic', password: 'xxx', scheme: 'https' }],
    transport_options: { ssl: { verify: false }})

  doc = Nokogiri::XML(xml_str)
  out_object = {}
  vse_key = doc.xpath('/crawl-urls/crawl-delete/@vse-key').text
  out_object['vse-key'] = vse_key
  out_object['action'] = 'delete'

  response = client.search( index: 'imt_*', body: { match: { _id: vse_key}}})
  index = response.dig('hits', 'hits', 0, '_index')

  out_object['index'] = index
  new_record_event.set('delete', out_object)
  return [new_record_event]
end

I have no problem getting 9.1.4 to load that script, provided I remove one of those } (there is an extra closing brace in the client.search call).
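
For reference, a brace-balanced version of that call would look something like the sketch below (note that the _search API would normally also expect the match clause under a top-level query key, which the original omits):

# Brace-balanced search call; query wrapper added per the usual _search body shape.
response = client.search(
  index: 'imt_*',
  body: { query: { match: { _id: vse_key } } }
)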

Is that deploying to a Kubernetes cluster?

No, just running Logstash from the command line on a cloud server.

I suspect the problem is that /usr/share/logstash/scripts/delete-imt-entities.rb does not contain what you want it to contain. An empty .rb file will produce the error you are getting.
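
For comparison, a script loaded via path => only has to define a filter method that returns an array of events (a register method is optional), so even a minimal stub along these lines will load:

# register is optional; it receives the script_params hash from the pipeline config.
def register(params)
end

# filter is mandatory; it must return an array of events (which may be empty).
def filter(event)
  [event]
end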

I removed the extra "}" but still get the same error. I am sure it is something to do with the way the script is being deployed into the container via the ConfigMap. If I remove the filter with the first ruby script (delete-imt-entities.rb), it just throws the same error on the next ruby script in the next filter.

I have worked out what was causing the error. I was adding the ruby scripts to the ConfigMap using the {{ .Files.Get "path/to/your/file" | quote }} construct, which meant the scripts ended up wrapped in quotes, and that is why they were not being read properly when the pipeline was being started.

Once I added the ruby code directly into the ConfigMap YAML file, the scripts loaded fine and the pipeline started.
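
In case it helps anyone else, another way to keep the chart files out of the ConfigMap YAML itself is to drop | quote and indent the raw file content into a literal block. A rough sketch (the name and the scripts/ path are just placeholders for wherever the files live in the chart):

apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-ruby-scripts
data:
  delete-imt-entities.rb: |-
{{ .Files.Get "scripts/delete-imt-entities.rb" | indent 4 }}

If there are several scripts, Helm's {{ (.Files.Glob "scripts/*").AsConfig | indent 2 }} can build the whole data: section in one go.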