I'm trying to create a simple pipeline with no filters, but Kibana shows the message field as blank.
The pipeline config is:
input {
  tcp {
    port => 9600
    codec => json
    mode => server
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myindex"
  }
}
When starting Logstash, the following warning is displayed:
[WARN ] 2020-08-10 16:33:02.698 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"myindex", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0xbba272>], :response=>{"index"=>{"_index"=>"myindex", "_type"=>"_doc", "_id"=>"oJs32XMBjyfMsZpZ2EjQ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [message] tried to parse field [message] as object, but found a concrete value"}}}}
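The mapper_parsing_exception means the index's existing mapping defines [message] as an object, while these events carry a plain string for that key, so Elasticsearch rejects them. One common workaround (a minimal sketch, not necessarily the filter referenced below; the target field name log_message is my own choice) is to rename the conflicting field before indexing:

```
filter {
  mutate {
    # Move the string out of the way of the existing object mapping.
    # "log_message" is a hypothetical field name; pick any unmapped field.
    rename => { "message" => "log_message" }
  }
}
```

Alternatively, reindexing into a fresh index (or deleting and recreating myindex) lets Elasticsearch map [message] as text from the first document.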
Other fields are being mapped, but I would like to see the full message as well.
Thanks @Badger, after applying the filter you suggested the messages are now displayed in the doc.
However, every line of the message generates a new doc instead of the full message appearing in a single doc. Is there a way to group these lines into one message?
** Edit: These docs share a common field, execution.execid. Is there a way to aggregate all messages into the same doc based on this ID?
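One way to do this is the logstash-filter-aggregate plugin. A sketch, assuming all lines for a given execid arrive within the timeout and the pipeline runs with a single worker (pipeline.workers => 1, which the aggregate filter requires); the full_message field name is my own choice:

```
filter {
  aggregate {
    # Group events that share the same execution ID.
    task_id => "%{[execution][execid]}"
    code => "
      map['execid'] ||= event.get('[execution][execid]')
      map['full_message'] ||= ''
      map['full_message'] += event.get('message').to_s + '
'
      event.cancel  # drop the per-line event
    "
    # Emit one combined event once no new lines arrive for 30 seconds.
    push_map_as_event_on_timeout => true
    timeout => 30
  }
}
```

This indexes one doc per execid containing the concatenated lines, instead of one doc per line. Note that the per-line docs are dropped by event.cancel, so only the aggregated doc reaches Elasticsearch.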