MapperParsingException: failed to parse

I have been getting the error below in my Elasticsearch cluster. I am using ES 1.4.4 and Logstash 1.4.2. Could anyone help me understand the issue? The Logstash grok parser is working fine and extracting the key:value pairs, but it looks like ES is having difficulty parsing the value.

failed to execute bulk item (index) index {[logstash-syslog-2015.07.22][syslog][AU61oQE3j3S3q-QDZeYQ], source[{"message":"10.12.63.146 - 22/Jul/2015:07:58:23 -0400 baam.arcesium.com GET "/munshi/geneva-web/gpl/ext/status-monitor??now=1&=1437566303947" HTTP/1.1 302 160 "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.134 Safari/537.36" 0.000 -","@version":"1","@timestamp":"2015-07-22T11:58:23.000Z","type":"syslog","host":"10.12.63.192","priority":150,"timestamp":["Jul 22 07:58:23","22/Jul/2015:07:58:23 -0400"],"logsource":"app2.x.ia55.net","program":"nginx","severity":6,"facility":18,"facility_label":"local2","severity_label":"Informational","clientIP":"10.12.63.146","user":"-","servername":"baam.arcesium.com","request":"GET","page":"/munshi/geneva-web/gpl/ext/status-monitor??now=1&=1437566303947","protocol":"HTTP/1.1","status":302,"bytes":160,"useragent":"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.134 Safari/537.36"","request_time":0,"response_time":"-"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [response_time]
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:416)
at org.elasticsearch.index.mapper.object.ObjectMapper.serializeValue(ObjectMapper.java:709)
at org.elasticsearch.index.mapper.object.ObjectMapper.parse(ObjectMapper.java:500)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:542)
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:491)
at org.elasticsearch.index.shard.service.InternalIndexShard.prepareCreate(InternalIndexShard.java:392)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:444)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:150)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:512)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:419)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NumberFormatException: For input string: "-"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:581)
at java.lang.Long.parseLong(Long.java:631)
at org.elasticsearch.common.xcontent.support.AbstractXContentParser.longValue(AbstractXContentParser.java:145)
at org.elasticsearch.index.mapper.core.LongFieldMapper.innerParseCreateField(LongFieldMapper.java:300)
at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:235)
at org.elasticsearch.index.mapper.core.AbstractFieldMapper.parse(AbstractFieldMapper.java:406)

It looks like response_time is mapped as a numeric (long) field in your mappings, but in this record you are passing in the string "-", which cannot be converted to a number, hence the NumberFormatException. There are a couple of options here:

  • Change the mapping to include "ignore_malformed": true for the response_time field. This means that when a value cannot be converted to a number, it is ignored and not indexed instead of failing the whole document (see the first sketch after this list).
  • Change your Logstash config file to deal with non-numeric values in the filters section (see the second sketch after this list).
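
For the mapping route, here is a minimal sketch of option 1, assuming the default Logstash index pattern logstash-*, the syslog type shown in your log, and a node reachable at localhost:9200 (the template name, host and port are placeholders). Applying it as an index template means each new daily logstash index picks up the setting; indices that already exist keep their current mapping.

    # Sketch only: template name and host are assumptions, adjust to your setup.
    # The stack trace shows response_time mapped as long, so keep that type and
    # add ignore_malformed so values like "-" are skipped rather than rejected.
    curl -XPUT 'http://localhost:9200/_template/logstash_response_time' -d '
    {
      "template": "logstash-*",
      "mappings": {
        "syslog": {
          "properties": {
            "response_time": { "type": "long", "ignore_malformed": true }
          }
        }
      }
    }'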
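For the Logstash route, a minimal sketch of option 2, assuming you simply want to drop response_time when nginx logs it as "-" (removing the field is one choice; replacing the value with 0 would also work, depending on how you want to query it later):

    filter {
      # Assumption: "-" from nginx means no response time was recorded,
      # so remove the field rather than send a non-numeric value to ES.
      if [response_time] == "-" {
        mutate {
          remove_field => [ "response_time" ]
        }
      }
    }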