I have a field called bytes_response that can take one of two kinds of values:
- an integer
- the literal "-" (a dash)
Here's the mapping in Elasticsearch (version 5.6.8):
"bytes_response": {
  "type": "integer",
  "ignore_malformed": true
}
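For context, an event like the following (a hypothetical example, reduced to the relevant field) is what triggers the failure, since grok captures the dash as a string:

```json
{ "bytes_response": "-" }
```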
Despite ignore_malformed being set, I still get the following error:
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [bytes_response]
...
Caused by: java.lang.NumberFormatException: For input string: "-"
What's interesting is that when I used Filebeat to read the files and ship the events to Logstash via the beats input plugin, I had no such issues.
My current Logstash config uses the file input plugin instead, since the files are local. Here's the config:
input {
  file {
    path => "/somepath/**/*.log"
    sincedb_path => "/somepath/sincedb"
    start_position => "beginning"
  }
}

filter {
  fingerprint {
    method => "SHA256"
    key => "somekey"
  }
  grok {
    patterns_dir => ["somepath"]
    match => [
      "message", "%{PREREQUEST} \"%{REQUEST}\" %{POSTREQUEST}"
    ]
    remove_field => ["message"]
  }
  if [request_path] and [http_referer] =~ "somestring" {
    date {
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      remove_field => ["timestamp"]
    }
    geoip {
      source => "client_ip"
      tag_on_failure => "internal_ip"
    }
  } else {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "someindex_%{+YYYY.MM}"
    document_id => "%{fingerprint}"
    action => "create"
  }
}
This is the part of the grok pattern that produces the bytes_response field:
(?<bytes_response>%{INT}|-)
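A workaround I'm considering (an untested sketch, not part of the config above) is to drop the field in Logstash whenever grok captured the dash, so that only numeric values ever reach Elasticsearch:

```
filter {
  # If grok matched the "-" alternative, remove the field entirely
  # instead of sending the literal dash to Elasticsearch.
  if [bytes_response] == "-" {
    mutate {
      remove_field => ["bytes_response"]
    }
  }
}
```

But I'd still like to understand why ignore_malformed doesn't handle this on the Elasticsearch side.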