Hi,
We are trying to onboard new logs into Elasticsearch, but they are not being indexed. The events land in the Logstash dead letter queue with "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"field name cannot be an empty string"}}.
The Logstash config is:
filter {
  grok {
    break_on_match => true
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp} \[%{LOGLEVEL:[@metadata][loglevel]}\s*\] - %{GREEDYDATA:jsonmessage}",
        "%{GREEDYDATA:jsonmessage}"
      ]
    }
  }
}
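The jsonmessage string captured by grok is then parsed with a json filter; a minimal sketch of that stage (any options beyond source are omitted here):

filter {
  # Parse the JSON string that grok captured into "jsonmessage"
  # into top-level fields on the event.
  json {
    source => "jsonmessage"
  }
}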
When I wrote the pipeline output to a file, the PayMent field is present in the JSON message, but PayMent is not created as a separate field; only its value survives, under an empty field name: "":"PayMent-00000000002345". Is this caused by an issue with the JSON parser, or did I miss something here?
Could you please guide me on this?
We can try, but you will need to share a couple of sample documents (please obfuscate sensitive/private information without changing the structure) and your Logstash configuration.
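In the meantime, if the parsed document really does carry an empty key, one workaround is to drop empty field names before the elasticsearch output. A minimal sketch using the prune filter (untested against your pipeline; the better fix is to stop the empty key being created upstream):

filter {
  # Drop any field whose name matches the empty string, so the event
  # no longer trips "field name cannot be an empty string" at index time.
  prune {
    blacklist_names => [ "^$" ]
  }
}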
When I wrote the output to a file I got the document below. The PayMent field is present in the JSON message, but after the JSON parser runs, the field name comes out empty.
{"x":{"data":{"response":{},"request":{}}},"jsonmessage":"{"logtype":"abclog","request":{"StreamName":"nonprod-stream","operation":"xyz","trackingID":"xxxxxxxx","type":"ABCDE"},"response":{"SequenceRole":"234446666666666677777777777778888888","PayMent":"PayMent-00000000002345"}} ","component":"data-abc","environment":"NONPROD","timestamp":"2025-07-23 10:28:45.279","":"PayMent-00000000002345","logstashhostname":"logstash-abc-0","logtype":"abclog","loglevel":"INFO"}