Failure to import data with Logstash into Elasticsearch

Using a Kafka input, I'm getting the following error when importing data from a topic:

{:exception=>#<ArgumentError: comparison of String with 13 failed>, :backtrace=>["org/jruby/RubyComparable.java:168:in `<'", "org/jruby/RubyString.java:1889:in `<'", "file:/opt/logstash/vendor/jruby/lib/jruby.jar!/jruby/java/java_ext/java.util.rb:82:in `[]'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/event.rb:73:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-json-0.1.7/lib/logstash/codecs/json.rb:35:in `decode'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-0.1.15/lib/logstash/inputs/kafka.rb:169:in `queue_event'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-0.1.15/lib/logstash/inputs/kafka.rb:139:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/pipeline.rb:176:in `inputworker'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/pipeline.rb:170:in `start_input'"], :level=>:error}  

The actual data pulled out of Kafka is:

[{"id": 231, "name": "Engineering"}, {"id": 232, "name": "Operations"}, {"id": 233, "name": "Design"}, {"id": 560, "name": "Business"}, {"id": 561, "name": "Legal"}, {"id": 562, "name": "Product"}, {"id": 563, "name": "Local Marketing and Support"}, {"id": 564, "name": "Expansion"}, {"id": 617, "name": "People Operations"}, {"id": 1369, "name": "Finance and Accounting"}, {"id": 1370, "name": "Public Policy and Communications"}, {"id": 1371, "name": "Growth Marketing"}, {"id": 7482, "name": "No Department"}]

The Kafka input configuration is:

kafka {
  topic_id => "greenhouse"
  type => "greenhouse"
}

I was able to fix this by importing the data with codec => plain. The working input, with just the codec overridden, looks like this:
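kafka {
  topic_id => "greenhouse"
  type => "greenhouse"
  codec => plain
}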

The problem was the top-level array of JSON maps: the json codec expects each Kafka message to decode to a single JSON object (a hash of fields), so it fails when handed an array.
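If you want structured events rather than a raw string, one option is to keep codec => plain on the input and parse the array in the filter stage instead. A minimal sketch, assuming a Logstash version whose split filter handles array fields; the departments target name is just an illustration:

filter {
  # Parse the raw message into an array under [departments]; the json
  # filter needs a target here because the root of the payload is an
  # array, not an object.
  json {
    source => "message"
    target => "departments"
  }
  # Emit one event per array element, so each {"id": ..., "name": ...}
  # map becomes its own event.
  split {
    field => "departments"
  }
}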