Hi,
I am using ELK version 6.3.0 and pushing data to ES with the Logstash JDBC input.
I have a query that fetches 188 records from Oracle.
However, 8 of those records are missed by Logstash. I get a data type conversion exception, i.e. the mapper cannot be changed from float to long. This happens only on the first run, and only for those 8 records: 180 records get pushed to ES. When I restart Logstash and run it a second time, all 188 records get pushed to ES.
Please suggest a solution for this issue. I want all records to be pushed to ES on the first run as well, without any exception.
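One thing I am considering is forcing the type of the problem columns in the Logstash filter before the events reach ES. This is only a rough sketch of what I have in mind, using my "total_income" column as an example:

filter {
  # Force the column to float so a whole-number value in the first
  # document cannot cause the field to be treated as long.
  mutate {
    convert => { "total_income" => "float" }
  }
}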
Below are my Logstash logs.
[2018-07-06T12:37:24,629][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x10d76091>], :response=>{"index"=>{"_index"=>"bank_crisil_rated", "_type"=>"doc", "_id"=>"urdqbmQBoIEiWn9atfMr", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [total_income] cannot be changed from type [float] to [long]"}}}}
Note: my database columns contain decimal values, but only a few columns give this exception. The column shown in the log above is "total_income", and in the database table it contains decimal values.
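If the real cause is dynamic mapping picking conflicting types for total_income during the first bulk request (decimal values mapped as float, whole-number values as long), I am also thinking about defining the mapping up front instead of relying on dynamic mapping. This is just a sketch; the template name and index pattern below are my own guesses for my index bank_crisil_rated:

PUT _template/bank_crisil_rated_template
{
  "index_patterns": ["bank_crisil_rated*"],
  "mappings": {
    "doc": {
      "properties": {
        "total_income": { "type": "double" }
      }
    }
  }
}

Would one of these two approaches fix the exception on the first run, or is there a better way?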