Codec msgpack: Trouble parsing msgpack input

I am trying to collect data from Kafka with Logstash. The data is msgpack-encoded, so I set the msgpack codec on the kafka input plugin, but decoding fails.
Here is my Logstash config:
input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "test_topic"
    group_id => "test_logstash"
    consumer_threads => 3
    codec => "msgpack"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test_app_package"
    document_type => "log"
  }
}
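
My understanding (an assumption on my part, not something I have confirmed in the docs) is that the msgpack codec expects each Kafka message to be one msgpack-encoded map, which it then turns into an event. As a sanity check of that payload shape I wrote the small Python sketch below with the msgpack package; it is not my real producer, just the format I assumed the producer writes.

import msgpack

# Toy event in the shape I assumed: one flat map per Kafka message.
event = {
    "host": "tdcv3.talkingdata.net",
    "url": "/g/d",
    "ts": 1463480497285,
}

payload = msgpack.packb(event)              # bytes that would be written to the topic
print(msgpack.unpackb(payload, raw=False))  # round-trips back to the same dict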

The Logstash logs show:

Trouble parsing msgpack input, falling back to plain text {:input=>"{"host":"tdcv3.talkingdata.net","ip":"41.220.68.139","url":"\/g\/d","headers":{"cookie":"Bearer-Type=w-TCP","connection":"keep-alive","accept":"\/","accept-encoding":"gzip, deflate","host":"tdcv3.talkingdata.net","x-forwarded-proto":"http","content-length":"701","wap-connection":"Stack-Type=HTTP","x-forwarded-port":"80","accept-charset":"","x-forwarded-for":"41.220.68.139"},"remote_addr":"172.31.12.60","ts":1463480497285}\x00\x96\xDA\x00!3de6db1210f7b83c5a32f449b700c05d2\xDA\x00 EC3785AF982EA9370A87558A565C820F\x99\xB7com.android.ys.services\xA33.6\xA236\xCF\x00\x00\x01Tx\x95t\n\xB2Android+TD+V1.2.61\xA6800000\xC2\xCF\x00\x00\x01I\xCC\x13\xE7\xE8\xCF\x00\x00\x01I\xCC\x13\xE7\xE8\xDC\x00\x17\xAFImose:Ankara S1\xA217\x92\xCB\x00\x00\x00\x00\x00\x00\x00\x00\xCB\x00\x00\x00\x00\x00\x00\x00\x00\xABarmeabi-v7a\xAB480854*240\xA2US\xA6MTN-NG\xA2en\x01\xADAndroid+4.4.2\x01\xA5HSDPA\xC2\xA562130\xA562130\xA0\xA0\x00\xDA\x00\x842|null|dd5ed285a49219c3|355656579263906|621300079003992|89234010002091165897|3de6db1210f7b83c5a32f449b700c05d2|null|0123456789ABCDEF\xDA\x00\xF1[{"type":"wifi","connected":false,"available":false},{"current":[{"basestationId":-1,"systemId":10984,"type":"HSDPA","mcc":"62130","operator":"MTN-NG","country":"ng","networkId":6986586}],"type":"cellular","connected":true,"available":true}]\xDA\x01-[{"extra1":"62130","extra2":"","extra4":"ng","displayName":"621300079003992","type":"sim","extra6":"355656579263898","name":"89234010002091165897"},{"extra1":"62160","extra2":"etisalat","extra4":"","displayName":"621600084059740","type":"sim","extra6":"355656579263906","name":"89234000088954147603"}]\x00\x00\x91\x92\x02\x97\xDA\x00$90b81cdf-1103-4a45-9b46-398774320873\xCF\x00\x00\x01T\xBD\xEDpH\x02\x00\x90\x91\x95\xAD__tx.sdk.send\xA0\x01\xCF\x00\x00\x01T\xBE:\xADV\x84\xA2to\xA9analytics\xA4code\xCB@i\x00\x00\x00\x00\x00\x00\xA7latency\xCB@\xA8\xE6\x00\x00\x00\x00\x00\xA4size\xCB@\x86\x00\x00\x00\x00\x00\x00\x01\xC0", :exception=>#<TypeError: can't convert String into Integer>, :level=>:warn}

{"host":"tdcv3.talkingdata.net","ip":"14.215.43.76","url":"/g/d","headers":{"host":"tdcv3.talkingdata.net","x-forwarded-for":"14.215.43.76","x-forwarded-proto":"http","connection":"keep-alive","x-forwarded-port":"80","content-length":"628"},"remote_addr":"172.31.24.190","ts":1463480497286}ء30803bf4f6100f2943ff35682aee8e446ؠ12EB5673BF3B3BFA4BE53CD8D59D545B¹com.lashou.groupurchasing¤7.13¨20150429́M범H²Android+TD+V1.2.61©wandoujiaÏM밁ŹM밁Zڗ±samsung:SM-N7506V¢18ʀ@³-ʀ6䕋f0«armeabi-v7a¬7201280320¢CN§GD CMCC¢z«Android+4.3£LTE¥46007¥46000٨22.954183,113.005296,,1463480495922,113.0,,,network:22.954183,113.005296,,1463480495922,113.0,,,network: ټ2|null|9f9f976465c9efdb|352204061537363|460077184011077|898600b2191507446330|30803bf4f6100f2943ff35682aee8e446|null|48e59b09۰[{"type":"wifi","connected":false,"available":false},{"current":[{"basestationId":0,"systemId":9828,"type":"LTE","mcc":"46000","operator":"GD CMCC","country":"cn","networkId":171629837}],"type":"cellular","connected":true,"available":true}] ؤ6574f8f8-acfd-4e95-bb2f-c63183efabeáOᅣែؤb8f4c97a-9659-4dba-b082-b07b30059496́TeSNᅠ{:exception=>#<NoMethodError: undefined method []=' for nil:NilClass>, :backtrace=>["/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-msgpack-2.0.2-java/lib/logstash/codecs/msgpack.rb:30:indecode'", "/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-msgpack-2.0.2-java/lib/logstash/codecs/msgpack.rb:24:in decode'", "/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-2.0.2/lib/logstash/inputs/kafka.rb:171:inqueue_event'", "/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-2.0.2/lib/logstash/inputs/kafka.rb:146:in run'", "/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb:206:ininputworker'", "/home/hadoop/td-logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.1.1-java/lib/logstash/pipeline.rb:199:in `start_input'"], :level=>:error}