Hi all,
The setup is:
Firewall --> Filebeat --> Logstash --> Elasticsearch
The following error keeps appearing in /var/log/logstash/logstash-plain.log:
[2023-03-23T18:03:53,651][WARN ][logstash.outputs.elasticsearch][main][96d3f1a45a0ecad7c0b459780eeece72a47c0655a8a6ffc6c49e0f8255fc9ec6] Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"firewall", :routing=>nil}, {"reason"=>"file-size", "action"=>"roll-log", "level"=>"notice", "type"=>"event", "@timestamp"=>2023-03-23T12:33:44.000Z, "logdesc"=>"Disk log rolled", "msg"=>"\"Disk", "eventtime"=>"1679574823731437365", "devname"=>"FORTIGATE", "subtype"=>"system", "log"=>"tlog", **"service"=>{"type"=>"fortinet"}**}], response: {"index"=>{"_index"=>"firewall-2023.03.23", "_id"=>"Q1d2DocBJN2GzK3gGZMP", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [service_] of type [text] in document with id 'Q1d2DocBJN2GzK3gGZMP'. **Preview of field's value: '{type=fortinet}**'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:277"}}}}
The filters in the logstash conf file are:
filter {
  grok {
    match => { "message" => "%{SYSLOG5424PRI}%{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }
  mutate {
    remove_field => ["@timestamp","agent","input","event","fileset","tags","ecs","log","source","@version"]
  }
  kv {
    field_split => " "
  }
  mutate {
    remove_field => ["message"]
    add_field => { "logdate" => "%{date} %{time}" }
  }
  date {
    match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "Asia/Kolkata"
    target => "@timestamp"
  }
}
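The only workaround I can think of so far is to drop or rename the object-valued field before the Elasticsearch output. This is a sketch, not a tested fix — it assumes the service field is added somewhere upstream (Filebeat, perhaps) and that its data can safely be discarded or moved:

```
filter {
  mutate {
    # Assumption: [service] arrives as an object ({"type"=>"fortinet"})
    # while the index mapping expects text, causing the 400. Renaming it
    # sidesteps the mapping conflict without losing the value:
    rename => { "[service]" => "[service_obj]" }
    # Or, if the field is not needed at all, drop it instead:
    # remove_field => ["service"]
  }
}
```

But this only masks the symptom; I would still like to know where the field is coming from.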
The field service => {"type"=>"fortinet"} appears to be the cause of the error, but I am unable to identify where this field and value originate. The firewall's log reference documentation makes no mention of such a field, and the raw log from the firewall doesn't contain it either.
How can this be resolved?