Logstash 6.0.0-alpha2: error in elasticsearch output (bug?)

Hi,

I've decided to try the pre-release version of the stack (to check out the changes, improvements, etc.).
So I downloaded all the pre-release versions, stopped my current stack and launched the new one. I migrated the Logstash config (but not the Elasticsearch data, so I could see the whole pipeline in action).

My current configuration has two inputs that read different but related file types. I first read all the files from one input, then activate the second, which updates the existing docs in ES (after cloning the event in a ruby filter).

The first few updates work well, but then I get an error on LS (non-fatal; it keeps retrying). Here is an extract of my config:

input

input {
    file {
        path => "C:/some/path/to/*/*/*.csv"
        tags => ["One"]
    }

    file {
        path => "C:/some/path/to/*/*/*.txt"
        tags => ["Two"]
        codec => multiline {
            pattern => "some pattern"
            negate => true
            what => "next"
        }
    }
}

filter

filter {
  if "One" in [tags] {
    ...
  }

  if "Two" in [tags] {
    ...
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-myla-server"
      query => "doc_id:%{[doc_id]}"
      fields => {
        "City" => "City"
      }
    }

    if "_elasticsearch_lookup_failure" not in [tags] {
      ruby {
        code => "
          new_event = event.clone
          new_event.set('tags', ['One'])
          new_event_block.call(new_event)
        "
      }
    }
  }
}
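In case it helps to see the cloning logic in isolation, here is the clone-and-retag pattern from the ruby filter as a minimal plain-Ruby sketch. The Event class below is a hypothetical stub with just enough of the Logstash event API (get/set/clone) to run standalone; it is not Logstash's actual Event class.

```ruby
# Hypothetical stub of the Logstash event API -- just enough to
# illustrate the clone-and-retag pattern from the ruby filter above.
class Event
  def initialize(data)
    @data = data
  end

  def get(key)
    @data[key]
  end

  def set(key, value)
    @data[key] = value
  end

  def clone
    # Shallow-copy the backing hash so retagging the clone
    # does not touch the original event.
    Event.new(@data.dup)
  end
end

event = Event.new('tags' => ['Two'], 'doc_id' => 'abc-123')

# Same steps as the ruby filter's code block:
new_event = event.clone
new_event.set('tags', ['One'])

puts event.get('tags').inspect      # ["Two"] -- original untouched
puts new_event.get('tags').inspect  # ["One"] -- clone re-enters the "One" branch
```

The point is that the clone keeps the doc_id but swaps its tags, so it flows through the "One" branch of the output and updates the existing document.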

output

output {
  stdout {
    codec => rubydebug { metadata => true }
  }

  if "_grokparsefailure" not in [tags] {
    if "One" in [tags] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-one"
        action => "update"
        document_id => "%{doc_id}"
        doc_as_upsert => true
      }
    } else if "Two" in [tags] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-two-%{+YYYY.MM}"
      }
    }
  }
}

And the error I get on the output is:

[2017-07-06T16:24:48,660][ERROR][logstash.outputs.elasticsearch] Encountered an unexpected error submitting a bulk request! Will retry. {:error_message=>"undefined local variable or method `event' for #<LogStash::Outputs::ElasticSearch:0x50ca86d9>", :class=>"NameError", :backtrace=>["c:/elk/logstash-6.0.0-alpha2/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.2-java/lib/logstash/outputs/elasticsearch/common.rb:153:in `submit'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:974:in `each_with_index'", "c:/elk/logstash-6.0.0-alpha2/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.2-java/lib/logstash/outputs/elasticsearch/common.rb:131:in `submit'", "c:/elk/logstash-6.0.0-alpha2/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.2-java/lib/logstash/outputs/elasticsearch/common.rb:91:in `retrying_submit'", "c:/elk/logstash-6.0.0-alpha2/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.3.2-java/lib/logstash/outputs/elasticsearch/common.rb:42:in `multi_receive'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:13:in `multi_receive'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/output_delegator.rb:47:in `multi_receive'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/pipeline.rb:493:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/pipeline.rb:491:in `output_batch'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/pipeline.rb:433:in `worker_loop'", "c:/elk/logstash-6.0.0-alpha2/logstash-core/lib/logstash/pipeline.rb:398:in `start_workers'"]}
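From the message, the NameError seems to come from the plugin's retry path in common.rb referencing a local variable `event` that is out of scope at that point (rather than from my config). Here is a contrived Ruby sketch of that class of error; this is hypothetical code to show how Ruby raises it, not the plugin's actual source:

```ruby
# Contrived sketch (NOT the plugin's code): a local variable defined
# in one method is not visible in another, so a bare reference to it
# is treated as a missing method and raises NameError.
class BulkSubmitter
  def prepare
    event = { 'doc_id' => 'abc-123' } # local to prepare only
    event
  end

  def submit
    event # NameError: undefined local variable or method `event'
  end
end

begin
  BulkSubmitter.new.submit
rescue NameError => e
  puts e.message
end
```

If the plugin hits a line like that while resubmitting a failed bulk request, it would retry forever with the same NameError, which matches what I see in the log.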

Is there something I've missed, or is this a bug in the alpha version?
(I copied the 6.0.0 Logstash conf back to 5.3.0, since I'd made small modifications, and it runs fine on 5.3.0.)
