Hi,
I have JSON data that is harvested by Filebeat and shipped to Logstash.
By default, Filebeat encapsulates the original JSON in the message field. I didn't change that.
What I want to do (in logstash):
- parse the JSON string stored in message
- if parsing fails, keep the original input message for debugging purposes
- if parsing succeeds, drop the original message
- the original application log JSON may itself contain a field named message; this must not clobber the message field created by Filebeat
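For illustration, an application log line could look like this (hypothetical example); note that it carries its own message field, which would collide with Filebeat's message field and produce an array:

```json
{"logType": "app", "level": "ERROR", "message": "connection refused"}
```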
 
My implementation idea looks like the following:
filter
{
	# remember logType
	mutate
	{
		# The embedded JSON may contain a field also called message; in that case an array would be created.
		# We don't want arrays, so we rename the field first.
		rename => ["message", "[@metadata][mymessage]" ]
	}
	# parse the json (should contain logType)
	json
	{
		id => "json"
		source => "[@metadata][mymessage]"
	}
	# Check whether we had parsing errors. If so, keep the original input for debugging, otherwise delete it.
	if "_jsonparsefailure" in [tags]
	{
		mutate
		{
			id => "keep_original_message"
			# If Logstash was not able to parse the JSON, keep the original message for debugging purposes. Otherwise get rid of it.
			rename => [ "[@metadata][mymessage]", "[logstash][debug][originalMessage]" ]
		}
	}
}
It works fine if I feed it valid JSON. But if I provoke a _jsonparsefailure, the field logstash.debug.originalMessage is empty and I get the following error in Logstash's log:
[2019-11-15T08:21:55,162][WARN ][logstash.filters.json    ][generic_json] Parsed JSON object/hash requires a target configuration option {:source=>"[@metadata][mymessage]", :raw=>"\"das ist ein parsing fehler2\" "}
If I use a normal field instead of @metadata, it works like a charm: the field gets renamed, the content stays, and I remove the field when no parsing error is tagged.
But I want to understand why these behaviors are different.
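For reference, the working variant looks roughly like this (the field name mymessage is just illustrative; the only change is that it is a regular field, which then has to be removed explicitly on success):

```
filter
{
	mutate
	{
		# Move Filebeat's message into a temporary field so an embedded
		# "message" field in the JSON cannot create an array.
		rename => ["message", "mymessage"]
	}
	json
	{
		source => "mymessage"
	}
	if "_jsonparsefailure" in [tags]
	{
		mutate
		{
			# Parsing failed: keep the raw input for debugging.
			rename => [ "mymessage", "[logstash][debug][originalMessage]" ]
		}
	}
	else
	{
		mutate
		{
			# Parsing succeeded: the temporary field is no longer needed.
			remove_field => [ "mymessage" ]
		}
	}
}
```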
Thanks, Andreas