Could not index event to Elasticsearch: cannot be changed from type [float] to [long]

Hi Team,

I have an issue creating an index from my database table data through the Logstash JDBC input.
The issue is:

[2018-07-06T12:37:24,587][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2aaa7c95>], :response=>{"index"=>{"_index"=>"bank_crisil_rated", "_type"=>"doc", "_id"=>"irdqbmQBoIEiWn9atfMq", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [capital_adequacy_ratio] cannot be changed from type [float] to [long]"}}}}


[2018-07-06T12:37:24,629][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x10d76091>], :response=>{"index"=>{"_index"=>"bank_crisil_rated", "_type"=>"doc", "_id"=>"urdqbmQBoIEiWn9atfMr", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [total_income] cannot be changed from type [float] to [long]"}}}}

Please suggest a solution.


You need to either change the mapping or use a mutate filter to transform your number to a float.
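For example, a minimal sketch of the mutate approach, assuming the event field names match the ones shown in the error messages above:

    filter {
        mutate {
            # force both fields to float so that integer-looking values
            # no longer make dynamic mapping propose type long
            convert => {
                "total_income" => "float"
                "capital_adequacy_ratio" => "float"
            }
        }
    }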

If you don't know how to do it, feel free to ask in the #logstash channel.

Hi @dadoonet

I have an issue pushing data into Elasticsearch from my database table through the Logstash JDBC input: some rows' data is missing and an exception is thrown.
The exception is:

[2018-07-06T12:37:24,587][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2aaa7c95>], :response=>{"index"=>{"_index"=>"bank_crisil_rated", "_type"=>"doc", "_id"=>"irdqbmQBoIEiWn9atfMq", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [capital_adequacy_ratio] cannot be changed from type [float] to [long]"}}}}


[2018-07-06T12:37:24,629][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x10d76091>], :response=>{"index"=>{"_index"=>"bank_crisil_rated", "_type"=>"doc", "_id"=>"urdqbmQBoIEiWn9atfMr", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [total_income] cannot be changed from type [float] to [long]"}}}}

But when we try to push the data again (a second time), all rows' data is pushed into Elasticsearch successfully.

I saw one solution using the mutate filter, but it is not working.
Can you help me, please?

Hi @dadoonet,

Can you provide a solution for this problem?
I have already asked you multiple times but have not got any solution from the Elasticsearch team.
I have already told you that the mutate filter is not working.
How can I solve this issue? Can you please give me an exact solution?

Here is the advice I gave you last time:

If you don't know how to do it, feel free to ask in the #logstash channel.

If you still want to ask here, you need to provide information to help diagnose the problem, such as:

  • a sample pipeline which reproduces your problem
  • an example of input content that works correctly
  • the input content that makes your pipeline fail
  • optionally, any mapping you defined
  • the mapping generated for your index
  • the index templates, if any exist
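
For reference, a self-contained pipeline of the kind requested above might look like the sketch below; the generator input and its values are hypothetical stand-ins for the real jdbc input and rows.

    input {
        # hypothetical stand-in for the jdbc input: one value that looks
        # like an integer and one that looks like a decimal
        generator {
            lines => ['{"total_income": 100}', '{"total_income": 100.5}']
            count => 1
            codec => "json"
        }
    }
    output {
        stdout { codec => rubydebug }
    }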

Hi @dadoonet,

1. A sample of the pipeline log:

    [2018-07-09T19:21:03,284][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated_test_for_problm", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x7dd752b4>], :response=>{"index"=>{"_index"=>"bank_crisil_rated_test_for_problm", "_type"=>"doc", "_id"=>"wMdPf2QBylXhGFh3VX8g", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [total_income] cannot be changed from type [float] to [long]"}}}}
    [2018-07-09T19:21:03,284][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated_test_for_problm", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x86af032>], :response=>{"index"=>{"_index"=>"bank_crisil_rated_test_for_problm", "_type"=>"doc", "_id"=>"HMdPf2QBylXhGFh3VYAn", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [capital_adequacy_ratio] cannot be changed from type [float] to [long]"}}}}
    [2018-07-09T19:21:03,287][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bank_crisil_rated_test_for_problm", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x3401b2af>], :response=>{"index"=>{"_index"=>"bank_crisil_rated_test_for_problm", "_type"=>"doc", "_id"=>"HsdPf2QBylXhGFh3VYAn", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [total_income] cannot be changed from type [float] to [long]"}}}}

2.
I am using ELK version 6.3.0 and pushing data to ES using the Logstash JDBC input.
I have one query which fetches 188 records from Oracle.
But from Logstash, 5-10 of my records get missed. I get a data type conversion exception, i.e. cannot convert from float to long. This happens only the first time while pushing data to ES, and only for some records. After 180 records were pushed to ES, the second time, when I restarted Logstash, all 188 records got pushed.
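
A likely explanation, inferred from these symptoms rather than stated anywhere in the thread: the index is created on the first run, and Elasticsearch's dynamic mapping derives each field's type from the first value it sees. When a single bulk request carries both integer-looking and decimal values for the same column, the documents propose conflicting mapping updates, and the ones that lose the race are rejected with exactly this error. For example:

    {"total_income": 150}      -> dynamic mapping proposes long
    {"total_income": 150.75}   -> dynamic mapping proposes float

On the second run the mapping already exists as float, integer values are simply coerced into it, and all 188 records index.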

Note: I have some predefined tables' data in my database which I want to push into ES using the Logstash JDBC input, creating the index in the process.

Here is my Logstash conf file:

input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_connection_string => "jdbc:oracle:thin:@AbDV:1931/CUPS"
    jdbc_user => "ABCTIX"
    jdbc_password => "ABC@2017"
    jdbc_driver_library => "/opt/READONLYREST/OSS/logstash-6.3.0/ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"

    statement => "Select Distinct Bm.Crmcompanyid, Cd.Company_Name,
      Bm.Datatype, Bm.Template,
      Bm.Isconsolidated,
      Bm.Periodendson,
      Bm.Crr_Date,
      bm.period_code,
      Column1  TOTAL_INCOME,
      Column10 TOTAL_OPERATING_EXPENDITURE,
      Column11 Total_Provisions_Cont,
      Column12 Adjusted_PROFIT_AFTER_TAX,
      Column13 Net_Intrst_Incm_Avg_Ttl_Assts,
      Column14 Non_Int_Income_Avg_Ttl_Assts,
      Column15 Non_Int_expenses_Avg_Ttl_Assts,
      Column16 PAT_Adjusted_Avg_Total_Assets,
      Column17 Intrst_Paid_on_Dep_Avg_Dep,
      Column18 Tier_I_Capital_Percentage,
      Column19 Capital_Adequacy_Ratio,
      Column2  TOTAL_ASSETS,
      Column20 Gross_NPA_Loans_Advances,
      Column21 Net_NPA_Loans_Advances,
      Column22 Networth_Net_NPA,
      Column23 CASA,
      Column24 Operating_Expenses,
      Column25 Equity_Share_Capital,
      Column3  TOTAL_LOANS_ADVANCES,
      Column4  TANGIBLE_NETWORTH,
      Column5  TOTAL_DEPOSITS,
      Column6  TOTAL_INTEREST_INCOME,
      Column7  TOTAL_INTEREST_PAID,
      Column8  NET_INTEREST_INCOME,
      Column9  TOTAL_OTHER_INCOME
    From Banknbfc_Periodmaster_Synm Bm,
      Banknbfc_Perioddata_Synm Bd,
      company_details_mv_synm cd
    Where Bm.Period_Code = Bd.Period_Code
      And Cd.Company_Code = Bm.Crmcompanyid
      and bm.template = 'Bank'
      and cd.company_status = 'Active'"
  }
}

filter {
  mutate {
    convert => { "TOTAL_INCOME" => "float" }
    convert => { "Capital_Adequacy_Ratio" => "float" }
  }
}

output {
  elasticsearch {
    hosts => "XXXXXXXX:123:176:9200"
    index => "bank_crisil_rated_test_for_problm"
    user => "ajitb"
    password => "Abc123"
    ssl => true
    ssl_certificate_verification => false
    truststore => "/opt/READONLYREST/elasticsearch-6.3.0/config/keystore.jks"
    truststore_password => "readonlyrest"
  }
}
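
One observation about the filter above, offered as an assumption rather than anything confirmed in the thread: the jdbc input lowercases column names by default (lowercase_column_names defaults to true), and the error messages do show total_income and capital_adequacy_ratio in lowercase. The mixed-case keys in the convert therefore probably match no event field, which would explain why the mutate appears to have no effect. A sketch with lowercase keys:

    filter {
        mutate {
            # keys must match the actual event field names,
            # which the jdbc input lowercases by default
            convert => {
                "total_income" => "float"
                "capital_adequacy_ratio" => "float"
            }
        }
    }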

Please help me; I have been suffering from this for the last 3-4 days.
I need the help of the Elasticsearch team.

Change your mapping and make those fields long.
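
One way to act on this advice, sketched as an assumption rather than a confirmed fix: create the index with an explicit mapping before the first Logstash run, so dynamic mapping never has to guess. Field names are taken from the error messages; float is used below because a float field also accepts integer values by coercion, but the same approach works with long if every value is integral.

    PUT bank_crisil_rated_test_for_problm
    {
      "mappings": {
        "doc": {
          "properties": {
            "total_income":           { "type": "float" },
            "capital_adequacy_ratio": { "type": "float" }
          }
        }
      }
    }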

Hi,

We have to create the index at runtime using Logstash. We have a table in the database whose columns contain some decimal and some integer values, and the index should be created at runtime when Logstash runs. We are able to create the index, but for a few records Logstash gives the exception "cannot convert from float to long". Our columns contain both decimal and integer values, so we don't know why it gives a typecasting exception from float to long; that is the first thing. Sometimes we also get a long-to-float exception. We have used all the data types in mutate, like:

filter {
    mutate {
        convert => { "TOTAL_INCOME" => "float" }
        convert => { "Capital_Adequacy_Ratio" => "integer_eu" }
    }
}


filter {
    csv {
        source => "data"
        columns => ['TOTAL_INCOME', 'Capital_Adequacy_Ratio']
        separator => ";"
    }
    mutate {
        convert => {
            "TOTAL_INCOME" => "float"
            "Capital_Adequacy_Ratio" => "float_eu"
            
        }
    }
}

convert => [ "[TOTAL_INCOME]", "float" ]
convert => [ "[Capital_Adequacy_Ratio]", "float" ]



mutate {
    convert => {
        "Total_Income" => "integer"
    }
}

mutate {
    convert => {
        "Capital_Adequacy_Ratio" => "integer"
    }
}

But we are getting the same issue. We have also tried the long data type, as you suggested, but we get the same exception. My database columns contain both integer and decimal values.
Please provide a solution for this issue.
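
A debugging step that might help here, offered as a suggestion rather than anything from the thread: check what Elasticsearch actually mapped for the disputed fields, since a Logstash-side convert cannot change a mapping that already exists.

    GET bank_crisil_rated_test_for_problm/_mapping

If the fields show up as float there, later runs will succeed regardless of the convert settings, which matches the second-run behavior described above.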

I moved your question to #logstash where you can hopefully get better help.

I'd encourage you to provide the information I asked for in Could not index event to Elasticsearch: cannot be changed from type [float] to [long].

Hi @dadoonet,

I have raised the same query with the Logstash team but did not get any reply from them.
I am still suffering from the type conversion issue and waiting for a response from the Logstash and Elasticsearch teams.
I have tried a number of approaches but have not had success.
It seems that without your help it is almost impossible; please help me to solve this issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.