Logstash 5.0 Alpha 4 csv input and mutate split error

Hi - I have simple .csv input files and a Logstash config that worked fine with Logstash 2.3.x, but using the same input files and config file with Logstash 5.0 alpha 4 produces the following error and crashes the pipeline:

An unexpected error occurred! {:error=>#&lt;NoMethodError: undefined method `eventget' for #&lt;LogStash::Filters::Mutate:0x6f66ad48&gt;&gt;, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:390:in `split'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:389:in `split'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:226:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:155:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:152:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:42:in `multi_filter'", "(eval):164:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):159:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):179:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):175:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):123:in `filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:287:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:285:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:237:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:210:in `start_workers'"], :level=>:fatal}

The input lines are all in CSV files, with contents like:

d 2016-07-18 12:47:06+0100 2016-07-18 12:47:05+0100 bo-1R8-4D9LK-3KH3KG-C0469@cmp.dotmailer.co.uk phil.mcnally@rsmtenon.com relayed 2.0.0 (success) smtp 250 ok 1468842426 qp 4814 server-13.tower-208.messagelabs.com!1468842425!21681222!1 cluster1.eu.messagelabs.com (195.245.230.83) smtp sov-dm-droid8 (10.20.101.219) smtp 172.20.114.14 195.245.230.83 PIPELINING,8BITMIME,STARTTLS 42751 105-114 RSM UK - Home dotMailerNeutral9

The config file that fails is:

input {
  file {
    path => [ "/var/log/pmta/powermta07/*.csv" ]
    tags => [ "pmta4.0" ]
    sincedb_path => "/etc/logstash/db/sincedb-pmtav4-powermta07"
    sincedb_write_interval => "15"
    discover_interval => "15"
    start_position => "beginning"
  }
}

filter {

  # Set CSV columns
  if "pmta4.0" in [tags] {
    csv {
      columns => [
        "msgtype",
        "timeLogged",
        "timeQueued",
        "orig",
        "rcpt",
        "orcpt",
        "dsnAction",
        "dsnStatus",
        "dsnDiag",
        "dsnMta",
        "bounceCat",
        "srcType",
        "srcMta",
        "dlvType",
        "dlvSourceIp",
        "dlvDestinationIp",
        "dlvEsmtpAvailable",
        "dlvSize",
        "vmta",
        "jobId",
        "envId",
        "queue",
        "vmtaPool"
      ]
      add_field => { "domain" => "%{rcpt}" }
      add_field => { "origDomain" => "%{orig}" }
      add_field => { "AccountID" => "%{orig}" }
      add_field => { "dlvPublicIp" => "%{dlvSourceIp}" }
      ##--  csv-pmta tag - TO BE REMOVED  --##
      add_tag => "csv-pmta"
    }
    # Check if the CSV was parsed correctly
    if [srcType] == "smtp" and "@" in [rcpt] {
      # Set the message timestamp using the timeLogged field
      date {
        match => [ "timeLogged", "YYYY-MM-dd HH:mm:ssZ" ]
      }
      # Rewrite fields that we search on to lowercase, as we use not_analyzed strings
      mutate {
        lowercase => [ "rcpt", "origDomain", "domain" ]
      }
      # Split fields to arrays using separators
      mutate {
        split => { "domain" => "@" }
      }
    }
  }
}

output {
  if "csv-pmta" in [tags] {
    elasticsearch {
      hosts => "192.168.0.231"
      index => "powermta-%{msgtype}-%{+YYYY.MM.dd}"
      document_type => "powermta"
      template => "/etc/logstash/powermta-template.json"
      template_name => "powermta"
      template_overwrite => true
      document_id => "%{@uuid}"
      flush_size => "15000"
    }
  } else {
    elasticsearch {
      hosts => "192.168.0.231:9200"
      index => "pmta_incomplete-%{msgtype}-%{+YYYY.MM.dd}"
    }
  }
}

However, if I exclude the following block:

  mutate {
    split => { "domain" => "@" }
  }

The error goes away and the logs are processed.
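For context, the operation that crashes is trivial: the mutate filter's `split` just calls Ruby's `String#split` on the field value. A minimal plain-Ruby sketch of what the `domain` field (copied from `rcpt` by `add_field` above) should become, using a sample value from the log line:

```ruby
# Plain-Ruby illustration of what mutate's split option does to the
# "domain" field; the field turns into a two-element array.
domain = "phil.mcnally@rsmtenon.com"
parts = domain.split("@")
puts parts.inspect   # ["phil.mcnally", "rsmtenon.com"]
puts parts[1]        # rsmtenon.com - the recipient's domain
```

So the data and the config are fine; the crash is inside the plugin itself before it ever reaches this call.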

There's a lot more processing to be done after this part of the config, but I can't get past this issue to complete the rest of the processing.

Any ideas would be greatly appreciated!!!

Thanks in advance,

TimC

Sorry, the full error got truncated; here it is in full:

An unexpected error occurred! {:error=>#&lt;NoMethodError: undefined method `eventget' for #&lt;LogStash::Filters::Mutate:0x6f66ad48&gt;&gt;, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:390:in `split'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:389:in `split'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.0.1/lib/logstash/filters/mutate.rb:226:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:155:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:152:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:42:in `multi_filter'", "(eval):164:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):159:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):179:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):175:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):123:in `filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:287:in `filter_batch'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:285:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:237:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:210:in `start_workers'"], :level=>:fatal}

I experience the same issue on my setup. @tcaudrey, have you found a solution in the meantime?

Sort of... Apparently it's been fixed in a newer version of the mutate filter plugin, but I'm not sure how to do the update?!

See Here:

https://discuss.elastic.co/t/how-to-update-to-new-logstash-filter-mutate-plugin-version/58366
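In case it helps anyone landing here: on Logstash 5.x, bundled plugins are managed with the `bin/logstash-plugin` tool, so updating the mutate filter should be along these lines (paths assume a package install under /usr/share/logstash; adjust for your layout):

```
cd /usr/share/logstash
bin/logstash-plugin list --verbose logstash-filter-mutate   # show the currently installed version
bin/logstash-plugin update logstash-filter-mutate           # pull the fixed release
```

Then restart Logstash so the updated plugin is loaded.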

Ta,

Tim.