"Cannot be changed from type [long] to [float]" exception when using the Logstash JDBC plugin

Hi all,
I am using the JDBC input plugin to load data from a database and merge it with an existing index. My Logstash configuration is as follows:

input {
  jdbc {
    jdbc_driver_library => "sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:3333;databaseName=Total"
    jdbc_user => "****"
    jdbc_password => "****"
    schedule => "30 8-23/1 * * *"
    last_run_metadata_path => "E:\logstashlog\.my_last_run"
    statement => "
      DECLARE @DDate1 CHAR(10)
      SELECT @DDate1 = REPLACE(MAX(FDate), '/', '') FROM Total.dbo.Total_Control
      SELECT [BaseDate_Accept_Setel],
             [Cycle_No]
      FROM dbo.vw_Total
      WHERE (BaseDate_Accept_Setel <= @DDate1) AND (BaseDate_Accept_Setel > :sql_last_value)"
    use_column_value => true
    tracking_column => "basedate_accept_setel"
  }
}

filter {
  mutate {
    add_field => { "basedate" => "%{basedate_accept_setel}" }
    convert => { "basedate" => "string" }
  }

  ruby {
    code => "
      event.set('Myear',  event.get('basedate')[0..3])
      event.set('Mmonth', event.get('basedate')[4..5])
      event.set('Mday',   event.get('basedate')[6..7])
    "
  }

  mutate {
    add_field => { "n_date" => "%{Myear}/%{Mmonth}/%{Mday}" }
  }

  jdbc_streaming {
    jdbc_driver_library => "sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:3333;databaseName=Total"
    jdbc_user => "****"
    jdbc_password => "****"
    parameters => { "mine" => "n_date" }
    statement => "
      DECLARE @ddate2 CHAR(10)
      SET @ddate2 = :mine
      SELECT Total_OnLine.dbo.s2md(@ddate2) AS gdate"
    target => "gdate"
  }

  mutate { replace => { "gdate" => "%{[gdate][0][gdate]}" } }
  mutate { gsub => [ "gdate", "/", "-" ] }
  mutate { gsub => [ "n_date", "/", "-" ] }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "txn_%{n_date}"
  }
  stdout { codec => rubydebug }
}
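For reference, the slicing done in the ruby filter can be checked outside Logstash. A minimal sketch, assuming basedate is an 8-character yyyymmdd string (the sample value "20190313" is just an illustration):

```ruby
# Standalone version of the ruby-filter logic above.
# basedate is assumed to be a yyyymmdd string, e.g. "20190313".
basedate = "20190313"

myear  = basedate[0..3]   # first four characters: the year
mmonth = basedate[4..5]   # next two characters: the month
mday   = basedate[6..7]   # last two characters: the day

n_date = "#{myear}/#{mmonth}/#{mday}"
puts n_date   # => 2019/03/13
```

This is the same value that later feeds the `index => "txn_%{n_date}"` output setting (after the gsub swaps "/" for "-").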

Sometimes the following warning appears, and in those cases some rows of the table are not indexed in Elasticsearch. For example, if there are 100 rows in the database table and the warning occurs 5 times, the created index contains only 95 documents. How can I handle this issue? Many thanks.

[2019-03-13T13:30:09,474][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"txn_2019-03-13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x39bf805a>], :response=>{"index"=>{"_index"=>"txn_2019-03-13", "_type"=>"doc", "_id"=>"_aB-dmkBtG7rmOFmq_lo", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [amount] cannot be changed from type [long] to [float]"}}}}

This is a warning from Elasticsearch. The mapping for the amount field in that index was set to long (from the first document indexed), but some of your events carry a float value for amount, and Elasticsearch cannot change a field's type once the mapping is created, so those events are rejected.
Use mutate/convert to ensure that the amount field's value is always a float.
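A minimal sketch of that fix, added inside the filter block of your configuration (assuming the field is named amount, as in the error message):

```
filter {
  mutate {
    convert => { "amount" => "float" }
  }
}
```

With this in place, every event reaches Elasticsearch with a float amount, so the dynamic mapping for each new daily txn_* index is created as float and later events no longer trigger a type-change rejection.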

Many thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.