Config error

Hello,

I am trying to parse a Java log file with grok, but the config test tells me: "The given configuration is invalid. Reason: no implicit conversion of Array into Hash". Running the config in debug mode, I see the following in the logs:

[2018-02-07T01:49:06,173][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"TypeError", :message=>"no implicit conversion of Array into Hash", :backtrace=>["org/jruby/RubyHash.java:1706:in `merge!'", "org/jruby/RubyHash.java:1742:in `merge'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:115:in `block in expr_attributes'", "org/jruby/RubyArray.java:1734:in `each'", "org/jruby/RubyEnumerable.java:936:in `inject'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:98:in `expr_attributes'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:76:in `expr'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:280:in `expr_body'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:226:in `block in expr'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:222:in `expr'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:69:in `expr'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:48:in `block in compile'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:46:in `compile'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:46:in `compile_ast'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:54:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:107:in `compile_lir'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

The config file is like this:

input {
  beats {
    # The port to listen on for filebeat connections.
    port => 5044
    # The IP address to listen for filebeat connections.
    host => "0.0.0.0"
  }
}
filter {
   if [type] == "jetty" {
     grok {
        match => { "message" => ["%{MONTH:[system][jetty][month]} %{MONTHDAY:[system][jetty][day]}, %{YEAR:[system][jetty][year]} %{TIME:[system][jetty][time]}%{CRON_ACTION:[system][jetty][day_period]} %{NOTSPACE:[system][jetty][class]} %{WORD:[system][jetty][method]}\n%{GREEDYMULTILINE:[system][jetty][multiline]}"] }
        pattern_definitions => {
          "GREEDYMULTILINE"=> "(.|\n)*"
        }
    }
    mutate {
      add_field => {
        "timestamp" => "%{[system][jetty][month]} %{[system][jetty][day]}, %{[system][jetty][year]} %{[system][jetty][time]} %{[system][jetty][day_period]}"
      }
      split => ["[system][jetty][multiline]", ":"]
      add_field => ["[system][jetty][multiline]", "%{[severity][0]}"]
      add_field => ["[system][jetty][multiline]", "%{[data][-1]}"]
      rename => {"severity" => "[system][jetty][severity]"}
      rename => {"data" => "[system][jetty][data]"}
    }
    date {
      match => ["timestamp", "MMM dd, YYYY KK:mm:ss aa"]
      remove_field => ["timestamp"]
    }
  }
}

Any idea what the issue might be?

Regards,
Peter

The only thing I can think of is that what follows add_field is sometimes an array and sometimes a hash.
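As far as I can tell from the backtrace, Logstash merges repeated options of a plugin while compiling the config, and merging the hash-form add_field with the array-form ones is what raises that TypeError. Keeping every add_field in hash form should get past the compile error. Just a syntax sketch, with your field logic left untouched:

  mutate {
    # hash form: "field name" => "value"
    add_field => {
      "[system][jetty][multiline]" => "%{[severity][0]}"
    }
  }

instead of

  mutate {
    # array form: ["field name", "value"]
    add_field => ["[system][jetty][multiline]", "%{[severity][0]}"]
  }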

Keep in mind that you may have to split your mutate filter, since the different options don't execute in the order they're listed in your configuration; they always run in a fixed order. See the sketch below.
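Something along these lines (a rough sketch only, with your options kept as they are and simply pulled apart so each mutate block runs in the order written):

  # build the timestamp first
  mutate {
    add_field => {
      "timestamp" => "%{[system][jetty][month]} %{[system][jetty][day]}, %{[system][jetty][year]} %{[system][jetty][time]} %{[system][jetty][day_period]}"
    }
  }
  # then split the multiline field
  mutate {
    split => ["[system][jetty][multiline]", ":"]
  }
  # then the remaining add_field and rename options
  mutate {
    add_field => { "[system][jetty][multiline]" => "%{[severity][0]}" }
  }
  mutate {
    add_field => { "[system][jetty][multiline]" => "%{[data][-1]}" }
  }
  mutate {
    rename => {
      "severity" => "[system][jetty][severity]"
      "data" => "[system][jetty][data]"
    }
  }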

Moving the date filter to right after the add_field of the timestamp solved the problem. Thank you for the information.
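For reference, the relevant part of the filter section now reads roughly like this (simplified, grok left unchanged before it):

  mutate {
    add_field => {
      "timestamp" => "%{[system][jetty][month]} %{[system][jetty][day]}, %{[system][jetty][year]} %{[system][jetty][time]} %{[system][jetty][day_period]}"
    }
  }
  date {
    match => ["timestamp", "MMM dd, YYYY KK:mm:ss aa"]
    remove_field => ["timestamp"]
  }
  # the split and rename mutate options follow after the date filter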
