Getting mapper_parsing_exception error and "field name cannot be an empty string" in Logstash

Hi,
We are trying to onboard new logs into Elasticsearch, but the logs are not being indexed. They appear in the Logstash dead letter queue with "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"field name cannot be an empty string"}}.
The Logstash config is:
filter {
  grok {
    break_on_match => true
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp} [%{LOGLEVEL:[@metadata][loglevel]}\s*] - %{GREEDYDATA:jsonmessage}",
        "%{GREEDYDATA:jsonmessage}"
      ]
    }
  }
  json {
    source => "jsonmessage"
    target => "[x][data]"
    tag_on_failure => ["_jsonparsefailure"]
    skip_on_invalid_json => true
  }
  mutate {
    # renaming a few fields
  }
}

When I write the events out to a file from the Logstash pipeline, the PayMent field is present in the JSON message, but it is not created as a separate field; only its value appears, under an empty key: "":"PayMent-00000000002345". Is this due to an issue with the json parser, or did I miss something here?
Could you please guide me on this?
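(For anyone following along: one quick way to inspect exactly what the filters emit, without going through a file, is a stdout output with the rubydebug codec. A minimal sketch, to be added alongside your existing outputs:)

```
output {
  stdout { codec => rubydebug }
}
```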

Welcome to the forum!

That's likely (part of) your problem!

We can try, but you will need to share a couple of sample documents (obfuscate sensitive/private information without changing the structure please), and your logstash configuration.

Hi Kevin,
Thanks for responding.
Sample log:
2025-07-23 05:23:23.846 [INFO] - {"logtype":"abclog","request":{"StreamName":"nonprod-stream","operation":"xyz","sequenceID":"xxxxxxx","type":"ABCDE"},"response":{"SequenceRole":"234446666666666677777777777778888888","PayMent":"PayMent-00000000002345"}}

Logstash config:
filter {
  grok {
    break_on_match => true
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp} [%{LOGLEVEL:[@metadata][loglevel]}\s*] - %{GREEDYDATA:jsonmessage}",
        "%{GREEDYDATA:jsonmessage}"
      ]
    }
  }
  json {
    source => "jsonmessage"
    target => "[x][data]"
    tag_on_failure => ["_jsonparsefailure"]
    skip_on_invalid_json => true
  }
  mutate {
    # renaming a few fields
  }
}

When I write the events out to a file I get the output below. The PayMent field is present in the JSON message, but after the json filter runs the field name shows as empty.
{"x":{"data":{"response":{},"request":{}}},"jsonmessage":"{"logtype":"abclog","request":{"StreamName":"nonprod-stream","operation":"xyz","trackingID":"xxxxxxxx","type":"ABCDE"},"response":{"SequenceRole":"234446666666666677777777777778888888","PayMent":"PayMent-00000000002345"}} ","component":"data-abc","environment":"NONPROD","timestamp":"2025-07-23 10:28:45.279","":"PayMent-00000000002345","logstashhostname":"logstash-abc-0","logtype":"abclog","loglevel":"INFO"}
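(Editor's note: one common way to end up with an empty field name like "":"..." is a mutate rename whose destination resolves to an empty string, e.g. a typo in the field reference. Since the mutate block above was elided as "renaming a few fields", this is only a guess; the field paths below are hypothetical, showing the general shape of a rename that keeps a named key:)

```
mutate {
  # hypothetical paths: move PayMent from the parsed JSON to a named top-level field
  rename => { "[x][data][response][PayMent]" => "payment" }
}
```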

Sorry, I’m not near my computer so can’t test anything, but your data doesn’t quite match the grok pattern, which btw has jsonmessage twice?

Is it always going to include request:{somestuff} followed by response:{otherstuff}?
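(Editor's note: one concrete mismatch is that [ and ] are regex metacharacters in grok, so the literal brackets around the log level need escaping; unescaped, [%{LOGLEVEL:...}\s*] is interpreted as a character class rather than matching "[INFO]". A sketch of the corrected first pattern, keeping everything else as in the original config:)

```
grok {
  break_on_match => true
  match => {
    "message" => [
      # \[ and \] match the literal brackets around the level, e.g. [INFO ]
      "%{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:[@metadata][loglevel]}\s*\] - %{GREEDYDATA:jsonmessage}",
      "%{GREEDYDATA:jsonmessage}"
    ]
  }
}
```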