I am using nxlog to ship Windows EventLogs to Logstash. Logstash is filling its own logs with warnings, and I can't work out why it's even complaining: it looks like it's about a field I'm not even using.
Part of the problem is that there seem to be about 100 events bundled into each 'push', but the main part of the warning looks like this:
:response => {
    "create" => {
        "_index" => "logstash-2015.11.10",
        "_type" => "eventlog",
        "_id" => "AVDyMq46b4HC4eCnWueF",
        "status" => 400,
        "error" => {
            "type" => "mapper_parsing_exception",
            "reason" => "failed to parse [Opcode]",
            "caused_by" => {
                "type" => "number_format_exception",
                "reason" => "For input string:\"Info\""
            }
        }
    }
},
:level => :warn
Now the odd thing is that, whilst there is an Opcode field in my data (for example, "Opcode"=>"Info"), I don't have anything that should be trying to parse it into a number.
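My current suspicion (an assumption on my part, the error doesn't say this explicitly) is that Logstash isn't the one doing the parsing at all: Elasticsearch's dynamic mapping may have typed Opcode as a numeric field when the first document of the day hit that index, so any later event where Opcode is the string "Info" gets rejected. One thing I could try is forcing the field to a string in the filter before it ever reaches Elasticsearch (untested sketch):

```
filter {
    mutate {
        # Coerce Opcode to a string so dynamic mapping
        # can't guess it as a number from an early event
        convert => { "Opcode" => "string" }
    }
}
```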
Here's my logstash input.conf:
input {
    tcp {
        type => "eventlog"
        port => 5002
        codec => "json"
    }
}

filter {
    if [type] == "eventlog" {
        mutate {
            lowercase => [ "EventType", "Hostname", "Severity" ]
        }
        mutate {
            rename => [ "Hostname", "source_host" ]
        }
        date {
            match => [ "EventTime", "YYYY-MM-dd HH:mm:ss" ]
        }
        mutate {
            rename => [ "Severity", "severity" ]
            rename => [ "SeverityValue", "severity_code" ]
            rename => [ "Channel", "channel" ]
            rename => [ "SourceName", "program" ]
            rename => [ "SourceModuleName", "nxlog_input" ]
            rename => [ "Category", "category" ]
            rename => [ "EventID", "event_id" ]
            rename => [ "ProcessID", "pid" ]
        }
        if [source_host] =~ ".*\.domain\.local" {
            mutate {
                gsub => [
                    "source_host", "\.domain\.local", ""
                ]
            }
        }
        mutate {
            remove_field => [ "RecordNumber", "EventTime", "EventReceivedTime" ]
        }
    }
}

output {
    if [type] == "eventlog" {
        elasticsearch {
            hosts => [ "10.10.0.11:9200" ]
            index => "logstash-eventlog-%{+YYYY.MM.dd}"
        }
    }
}
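One detail I've just noticed: the failing document went into logstash-2015.11.10, not my logstash-eventlog-* index, so the mapping actually being applied may not be the one I show below. If dynamic mapping is the culprit, an index template pinning Opcode to a string for future daily indices might be a workaround (a sketch for ES 2.x; the template name is my own invention, and existing indices would keep whatever mapping they already have):

```
PUT /_template/opcode-as-string
{
    "template": "logstash-*",
    "mappings": {
        "eventlog": {
            "properties": {
                "Opcode": { "type": "string" }
            }
        }
    }
}
```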
If I look at the Elasticsearch index mapping, the field is set as:
"Opcode": {
"norms": {
"enabled": false
},
"type": "string",
"fields": {
"raw": {
"ignore_above": 256,
"index": "not_analyzed",
"type": "string"
}
}
},
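Failing all that, since I don't actually use Opcode, a blunt workaround might be to drop the field in the filter before output (untested sketch, added to my existing eventlog filter block):

```
filter {
    if [type] == "eventlog" {
        mutate {
            # I never query Opcode, so just drop it
            # rather than fight the index mapping
            remove_field => [ "Opcode" ]
        }
    }
}
```

Is there a better way to work out which mapping is actually rejecting these events?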