Hello community -
I've set up a Logstash config to parse some VMware host logs, but I'm seeing this error in my Logstash logs:
:response=>{"index"=>{"_index"=>"vmware-2020.05.14", "_type"=>"_doc", "_id"=>"ejurEnIBcPJpTta85B7J", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id 'ejurEnIBcPJpTta85B7J'. Preview of field's value: '{name=MTXSWSQL01.matrix.int}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:871"}}}}}
Here's my config for these logs:
input {
  udp {
    port => 5140
    type => syslog
  }
}

filter {
  if [message] =~ /-esx-/ {
    if [message] =~ /^\S+ \S+ \S+ \S+: \S+ \S+ \[Originator@\d+ [^]]+\] .*$/ {
      grok {
        match => [ "message", "\S+ \S+ (?<syslog_hostname>\S+) (?<esxservice>\S+): (?<level>\S+) (?<esxprocess>\S+) \[Originator@\d+ (?<esxsubinfo>[^]]+)\] (?<esxmessage>.*)" ]
      }
      kv {
        source => "esxsubinfo"
      }
    }
    else {
      grok {
        match => [ "message", "^\S+ \S+ (?<syslog_hostname>\S+) (?<esxservice>\S+): (?<esxservicemessage>.*)$" ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["X.X.X.X:9200"]
    #index => "vmware"
    index => "vmware-%{+YYYY.MM.dd}"
    user =>
    password =>
    cacert => '/etc/logstash/certs/ca.crt'
    ssl_certificate_verification => false
  }
}
What do I have to fix? I already rechecked the old config, but I cannot spot my error.
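If I read the error correctly, my guess is that the event's [host] field now arrives as an object ({name=MTXSWSQL01.matrix.int}) while the existing vmware-* index mapping has [host] as text, so Elasticsearch can't index it. Would a mutate filter like the following be the right workaround (the field name host_name is just my own choice here)?

filter {
  mutate {
    # keep the hostname as a flat string field
    rename => { "[host][name]" => "host_name" }
    # drop the leftover host object so it no longer clashes
    # with the text mapping (remove_field runs after rename)
    remove_field => [ "host" ]
  }
}

Or would it be cleaner to delete the current day's index and let a new mapping be created with [host] as an object?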
Thanks in advance!