Broken Logstash -> Elasticsearch indexing after upgrading from 5.4 to 6.4

I've had a 5.x ELK server running for a long time, with a bunch of grok patterns taking syslog input from our network devices (Cisco routers/switches/firewalls) into Logstash and outputting it to Elasticsearch. I decided to go ahead and upgrade to 6.4... after reading through all the change info and upgrade considerations for 6.4, I didn't see anything that stood out as a "major issue", so I went on with the upgrade.

For the most part everything went smoothly and all the services seemed to start just fine. However, after logging in to the Kibana interface I had no logs... The real trouble showed up when I looked at the Logstash log file. I guess one of the "little" changes in the changelog COMPLETELY BROKE EVERYTHING! It was all presented in an "oh well, these are just some little things we changed, nothing major" manner, not as a big red "hey, this is going to kill everything" type of warning.

Now... how to fix what broke? I'm sorry if this is a simple fix (or a dumb question), but I built this server almost two years ago with little modification since then, and everything had been working fine until today. It took almost two months of digging through forums and Google to figure out the build process and modifications, so I don't remember everything that I did...

Here is the error logged for just about every Logstash event sent to Elasticsearch:

[2018-09-07T11:50:47,933][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-cisco-fw-edge-2018.09.07", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x4f260c5>], :response=>{"index"=>{"_index"=>"logstash-cisco-fw-edge-2018.09.07", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]"}}}}}
[2018-09-07T11:50:47,933][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-cisco-fw-edge-2018.09.07", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x493be082>], :response=>{"index"=>{"_index"=>"logstash-cisco-fw-edge-2018.09.07", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]"}}}}}
[2018-09-07T11:50:47,933][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-cisco-fw-edge-2018.09.07", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x6ad047cc>], :response=>{"index"=>{"_index"=>"logstash-cisco-fw-edge-2018.09.07", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]"}}}}}
[2018-09-07T11:50:47,933][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-cisco-fw-edge-2018.09.07", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x5e3e11cc>], :response=>{"index"=>{"_index"=>"logstash-cisco-fw-edge-2018.09.07", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]"}}}}}
[2018-09-07T11:50:47,933][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-cisco-fw-edge-2018.09.07", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x888955a>], :response=>{"index"=>{"_index"=>"logstash-cisco-fw-edge-2018.09.07", "_type"=>"doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Failed to parse mapping [_default_]: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]"}}}}}

Originally I had to delete all the indices because they were still in an old format that never got upgraded during the 5.x days (I think this should have happened automatically when I upgraded to 5.x, but it didn't). But now I get the above errors and have no idea where the process is broken.
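For reference, the index cleanup itself was just the standard APIs, roughly something like this (the index pattern below is only an example, substitute whatever your indices are actually named):

  # List the existing indices to see which ones are still in the old format
  curl -X GET "localhost:9200/_cat/indices?v"

  # Delete the old-format indices (pattern is an assumption, adjust to your index names)
  curl -X DELETE "localhost:9200/logstash-cisco-fw-edge-*"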

Here is a snippet of the Logstash output config:

  # Pass firewall logs to unique indices for organization
  if "cisco-fw" in [type] {
    if "edge-fw" in [tags] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-cisco-fw-edge-%{+YYYY.MM.dd}"
      }
    }
  }

I have no idea where to start looking for the broken string/field type, nor how to fix it, and any help would be great.

Well, I finally found all the automatically-installed template info. The query curl -X GET "localhost:9200/_template/*?pretty" helped me pull every template and narrow it down to my specific one by searching for 'float'. Once I updated that template in Elasticsearch to change 'float' to 'string', things started working again.
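In case it helps anyone else, the change was along these lines. This is only a sketch: the template name, index pattern, and the dynamic_templates entry are placeholders for whatever your own template actually contains. The real template will have far more settings and mappings; the only part that needs to change is the entry that still uses 'float' as a match_mapping_type, since 6.x only accepts the values listed in the error message.

  # Dump all templates and find the one that still references 'float'
  curl -s -X GET "localhost:9200/_template/*?pretty" | grep -n -B 5 "float"

  # Re-upload the corrected template. The name and body below are placeholders;
  # keep the rest of your real template as-is and only change the match_mapping_type value.
  curl -X PUT "localhost:9200/_template/logstash-cisco-fw" \
    -H 'Content-Type: application/json' -d'
  {
    "index_patterns": ["logstash-cisco-fw-*"],
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "float_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": { "type": "float" }
            }
          }
        ]
      }
    }
  }'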
