ELK 5 -- multiline codec not working with imap

Hi,

I keep getting these two messages in my Logstash log when trying to run my config file. I have tried just about every combination I can think of and I still cannot get this error to go away. I have run this config with a file input with no problem, but the moment I change to imap, this error occurs. As soon as I take the multiline codec away and send a single-line message through imap, it works again.

[2016-11-08T17:40:27,142][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>#<NoMethodError: undefined method `cancelled?' for #<Array:0x1017c494>>, "backtrace"=>["/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:186:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:185:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:225:in `start_workers'"]}

[2016-11-08T17:40:27,232][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `cancelled?' for #<Array:0x1017c494>>, :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:186:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:185:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:225:in `start_workers'"]}

My config is below:

input {
  imap {
    codec => multiline {
      pattern => "^\s"
      what => previous
    }
    host => "imap.gmail.com"
    user => "XXXX@gmail.com"
    password => "XXXXXXX"
    port => 993
    secure => true
    check_interval => 30 # Specified in seconds
    fetch_count => 1
  }
}

filter {
  grok {
    match => { "message" => "(?m)%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:severity} %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

And a sample input log is below, which I have tested with the Grok Constructor with no errors -- if the problem were with my filter, I'd imagine I would be getting a different error.

2015-03-24 11:11:58,666 ERROR Processing request failed kafka.common.FailedToSendMessageException: Failed to send messages after 3 tries.
         at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:90)
         at kafka.producer.Producer.send(Producer.scala:77)
         at kafka.javaapi.producer.Producer.send(Producer.scala:42)
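
For reference, this is roughly what the Grok Constructor reports for that sample against my pattern (values trimmed a little; the (?m) prefix is what lets GREEDYDATA run across the newlines):

timestamp => "2015-03-24 11:11:58,666"
severity  => "ERROR"
message   => "Processing request failed kafka.common.FailedToSendMessageException: Failed to send messages after 3 tries.
              at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:90)
              ..."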

I've looked all over and this error/exception doesn't seem to appear in any discussion.

Thanks a lot!

So I took the multiline codec out of my config entirely, and my multi-line message is still imported perfectly! Is the multiline codec not necessary anymore with the imap input? I'm happy it's working, at least!
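
In case it helps anyone else, this is roughly what my working input looks like now -- just the imap block with the multiline codec removed (credentials are placeholders, everything else unchanged from my original config):

input {
  imap {
    host => "imap.gmail.com"
    user => "XXXX@gmail.com"
    password => "XXXXXXX"
    port => 993
    secure => true
    check_interval => 30 # Specified in seconds
    fetch_count => 1
  }
}

I'm guessing the imap input hands over the whole email body as a single event anyway, so the (?m) in my grok pattern is what handles the multi-line matching rather than the codec.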
