Value out of range

Using Metricbeat 6.2.1 and Logstash 6.2.2, I am getting the following error. Is this a Metricbeat bug? It seems that the value is too large to fit into a long in ES?

[2018-02-23T23:09:43,469][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-am2-2018.02", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x69540279], :response=>{"index"=>{"_index"=>"metricbeat-am2-2018.02", "_type"=>"doc", "_id"=>"dBTtxGEBNoUJgeSF8CCY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [system.process.cgroup.memory.kmem_tcp.limit.bytes]", "caused_by"=>{"type"=>"i_o_exception", "reason"=>"Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@6247f9a3; line: 1, column: 341]"}}}}}
[2018-02-23T23:09:43,469][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-am2-2018.02", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x41d342de], :response=>{"index"=>{"_index"=>"metricbeat-am2-2018.02", "_type"=>"doc", "_id"=>"dRTtxGEBNoUJgeSF8CCY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [system.process.cgroup.memory.kmem_tcp.limit.bytes]", "caused_by"=>{"type"=>"i_o_exception", "reason"=>"Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@6fafce7a; line: 1, column: 341]"}}}}}
[2018-02-23T23:09:43,469][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-am2-2018.02", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0xbb8ae0b], :response=>{"index"=>{"_index"=>"metricbeat-am2-2018.02", "_type"=>"doc", "_id"=>"dxTtxGEBNoUJgeSF8CCY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [system.process.cgroup.memory.kmem_tcp.limit.bytes]", "caused_by"=>{"type"=>"i_o_exception", "reason"=>"Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@60aceb3b; line: 1, column: 290]"}}}}}
[2018-02-23T23:09:43,469][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-am2-2018.02", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0xc041462], :response=>{"index"=>{"_index"=>"metricbeat-am2-2018.02", "_type"=>"doc", "_id"=>"eBTtxGEBNoUJgeSF8CCY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [system.process.cgroup.memory.kmem_tcp.limit.bytes]", "caused_by"=>{"type"=>"i_o_exception", "reason"=>"Numeric value (18446744073709551615) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@48d5c88b; line: 1, column: 341]"}}}}}
[2018-02-23T23:09:44,184][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"metricbeat-am2-2018.02", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x3e49a150], :response=>{"index"=>{"_index"=>"metricbeat-am2-2018.02", "_type"

That's exactly the case.

You can work around the issue by dropping the field. I think there's an example in the ticket: Missing cgroup fields on hardened Debian 8 machines leads to errors · Issue #5854 · elastic/beats · GitHub
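For reference, one way to drop the field at the source is a drop_fields processor in metricbeat.yml. This is a minimal sketch, not the exact fix from the issue; the field path comes from the error messages above (the offending value 18446744073709551615 is 2^64 - 1, an unsigned 64-bit "unlimited" sentinel that cannot fit in Elasticsearch's signed long). Check the linked ticket for the full list of fields affected on your hosts:

```yaml
# metricbeat.yml (sketch) — drop the cgroup field whose "unlimited"
# value (2^64 - 1) overflows Elasticsearch's signed long mapping.
processors:
  - drop_fields:
      fields:
        - "system.process.cgroup.memory.kmem_tcp.limit.bytes"
```

Alternatively, the same field could be removed in a Logstash mutate filter before the elasticsearch output, but dropping it in Metricbeat avoids shipping the bad value at all.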

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.