Hello, I am running Logstash with 11 pipelines for 9 different log types. When I ship those logs from Filebeat to Logstash and tail the Logstash logs, I see the warning below, yet Elasticsearch and Kibana keep receiving well-structured logs continuously. How can I get rid of this error? A simplified sketch of one of my pipelines and the workaround I am considering follow the log excerpt.
[2021-09-15T13:09:39,436][WARN ][logstash.outputs.elasticsearch][cppdevices][62492695d933728f438151c7ae27d1c22f6d509536c592174d0b10ab262b3e15]
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"69399857", :_index=>"soleil", :routing=>nil}, {"ndc"=>{},
"device"=>{"domain"=>"i06-m-cx", "member"=>"falconx1.1", "family"=>"dt", "name"=>"i06-m-cx/dt/falconx1.1"}, "@timestamp"=>2021-03-26T07:36:50.336Z,
"@version"=>"1", "host"=>{"name"=>"localhost.localdomain"}, "thread"=>"5680", "input"=>{"type"=>"log"}, "log"=>{"flags"=>["multiline"], "offset"=>72678,
"file"=>{"path"=>"/vagrant/assembly/ds_RecordingManager/cppdevices/cpplogs-20210326-22h46/i06-m-cx_dt_falconx1.1.log.1"}},
"agent"=>{"ephemeral_id"=>"3c90d315-7268-46bc-9ee2-05c1c3a8b1ee", "id"=>"5f61175d-e407-49d3-b9a3-3efa77d25554", "hostname"=>"localhost.localdomain",
"type"=>"filebeat", "version"=>"7.14.0", "name"=>"localhost.localdomain"}, "ecs"=>{"version"=>"1.10.0"}, "event"=>{}, "tags"=>["cppdevices",
"beats_input_codec_plain_applied"], "message"=>{}, "level"=>"INFO"}], :response=>{"index"=>{"_index"=>"soleil", "_type"=>"_doc", "_id"=>"69399857",
"status"=>400, "error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse field [message] of type [text] in document with id '69399857'. Preview of field's value: '{}'",
"caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on
a START_OBJECT at 1:723"}}}}}
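
For context, my real pipeline configurations are much larger; one of them is shaped roughly like the sketch below. The beats port and the elasticsearch host are simply the defaults I assume here, only the index name soleil comes from the log above, and log_id stands in for whatever field I actually use as the document id (the log shows explicit _id values).

    input {
      beats {
        port => 5044              # default Beats port; Filebeat ships here
      }
    }

    filter {
      # multiline / parsing logic for the C++ device logs goes here
    }

    output {
      elasticsearch {
        hosts       => ["http://localhost:9200"]
        index       => "soleil"
        document_id => "%{log_id}"   # hypothetical field name, stands in for my real id field
      }
    }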
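
Reading the :response part, the 400 seems to come from the message field arriving as an empty object ({}) while the soleil index maps message as text, so Elasticsearch cannot parse an object into a text field (ndc and event are also empty objects in the event, but the error only names message). The snippet below is the kind of workaround I have been considering, a minimal sketch that simply drops message whenever it is not a string; I am not sure this is the right approach, which is part of my question.

    filter {
      ruby {
        # If "message" ended up as an empty object instead of a string,
        # remove it so the elasticsearch output does not hit the text mapping.
        code => '
          msg = event.get("message")
          event.remove("message") unless msg.nil? || msg.is_a?(String)
        '
      }
    }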