Cannot convert from long to float in Logstash

I'm importing data from SQL Server into Elasticsearch 6.6 using Logstash and the JDBC input plugin. Some of my records are skipped by Logstash, which prints these warning logs:

[2019-02-15T16:48:52,630][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"2111348540", :_index=>"order_tracking1", :_type=>"orderstracking_report", :_routing=>nil}, #LogStash::Event:0x14b8bcd8], :response=>{"index"=>{"_index"=>"order_tracking1", "_type"=>"orderstracking_report", "_id"=>"2111348540", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [forecast_voice_rgu] cannot be changed from type [long] to [float]"}}}}

....[2019-02-15T16:48:52,867][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"13625215", :_index=>"order_tracking1", :_type=>"orderstracking_report", :_routing=>nil}, #LogStash::Event:0x70e6f34c], :response=>{"index"=>{"_index"=>"order_tracking1", "_type"=>"orderstracking_report", "_id"=>"13625215", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [forecast_total_rgu] cannot be changed from type [long] to [float]"}}}}

........[2019-02-15T16:48:52,872][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"190215307182245", :_index=>"order_tracking1", :_type=>"orderstracking_report", :_routing=>nil}, #LogStash::Event:0x729a53dd], :response=>{"index"=>{"_index"=>"order_tracking1", "_type"=>"orderstracking_report", "_id"=>"190215307182245", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [forecast_total_rgu] cannot be changed from type [long] to [float]"}}}}

Three of my records are missing because of these errors, even though the fields [forecast_total_rgu] and [forecast_voice_rgu] are not long but float in my database, with values of up to 2 decimal places. I don't understand why Logstash skips these records and prints these logs, given that Logstash doesn't support a long data type.

I just created another test index and re-imported the same data. This time it skipped 4 other records and imported the previously skipped ones without any problem. I don't understand this seemingly random behaviour of Logstash.

Can anybody help?

This has little to do with Logstash itself; it's Elasticsearch that's actually rejecting those records. You only see them in your Logstash logs because the errors are propagated back by Elasticsearch.

The reason they are rejected has to do with how each field is mapped when the index is created.
Unless you define an index mapping/template so that Elasticsearch knows what data type each specific field should be (and also make sure all values for that field are of the same type), Elasticsearch decides the type itself based on the first value it sees for that field. If the first document happens to contain a whole number, the field is mapped as long, and any later document with a decimal value is rejected.
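As a minimal sketch of what that could look like, you could create the index with an explicit mapping before running the import. The index, type, and field names below are taken from your error logs; adjust them to your actual setup:

PUT order_tracking1
{
  "mappings": {
    "orderstracking_report": {
      "properties": {
        "forecast_total_rgu":  { "type": "float" },
        "forecast_voice_rgu":  { "type": "float" }
      }
    }
  }
}

With the mapping fixed up front, documents with whole-number values for those fields will still be indexed, just coerced to float, instead of locking the field into long.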

You can also use Logstash's mutate filter to convert all values of those fields to the same type, which should help you avoid this problem in the future.
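For example, something along these lines in the filter section of your pipeline configuration, again assuming the field names from your error logs:

filter {
  mutate {
    # force both fields to float before they reach Elasticsearch
    convert => {
      "forecast_total_rgu" => "float"
      "forecast_voice_rgu" => "float"
    }
  }
}

Note that this only guarantees the events leaving Logstash have a consistent type; if the index already has those fields mapped as long, you still need to reindex or recreate the index with the correct mapping.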

I have already tried that, but to no avail. I've imported the same data from a .csv file and it works fine for me; the issue only occurs with SQL. The problem you described should also occur with .csv, but it doesn't.
