Logstash importing data from MySQL cannot handle tinyint fields defined in MySQL

I defined the field as integer in the Elasticsearch type template; here is a screenshot of the definition in Kibana.
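The screenshot is not preserved here, but based on the error below the mapping presumably looked something like this minimal sketch (the template name and index pattern are assumptions; `record_impression` and the `log` type come from the error message):

```json
PUT _template/leads
{
  "template": "leads_*",
  "mappings": {
    "log": {
      "properties": {
        "record_impression": { "type": "integer" }
      }
    }
  }
}
```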

I use Logstash to port data from MySQL to Elasticsearch. When I turn on the pipeline, this response is repeated for each message I ingest:

2:15:47.934 [[main]>worker0] WARN logstash.outputs.elasticsearch - Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"leads_2017_02", :_type=>"log", :_routing=>nil}, 2017-03-02T22:15:43.916Z %{host} %{message}], :response=>{"index"=>{"_index"=>"leads_2017_02", "_type"=>"log", "_id"=>"AVqRF9pmPvi4HyPItega", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [record_impression]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Current token (VALUE_FALSE) not numeric, can not use numeric value accessors\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@7464568c; line: 1, column: 509]"}}}}

Problem solved. It looks like the MySQL JDBC driver returns tinyint(1) columns as booleans (hence the `VALUE_FALSE` token in the error), which then fails to parse against the integer mapping. I cast the field to char in the MySQL query, and then used a mutate filter in Logstash to convert it back to an integer. It works.
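A minimal sketch of that workaround as a Logstash pipeline, assuming a hypothetical `leads` table and placeholder connection settings (only `record_impression` is taken from the thread above):

```conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Cast the tinyint column to char so the JDBC driver
    # does not hand it to Logstash as a boolean
    statement => "SELECT id, CAST(record_impression AS CHAR) AS record_impression FROM leads"
  }
}

filter {
  mutate {
    # Convert the string back to an integer before indexing
    convert => { "record_impression" => "integer" }
  }
}
```

Alternatively, MySQL Connector/J has a `tinyInt1isBit` connection property; appending `?tinyInt1isBit=false` to the JDBC URL should make the driver return tinyint(1) columns as numbers in the first place, avoiding the cast entirely.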

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.