Date filter results in empty documents and no error in logs

Hello,

I am trying to apply the date filter to data that has been created by the json filter. The relevant field is placed into [data][timestamp] as text. When the following filter is applied, Logstash keeps creating empty documents, and no errors appear in the log. What could be causing this?

Thank you all in advance,

Below are examples and screenshots.

data by json filter:

"data"."timestamp": "1716886015302"

filter snippet:

    date {
        match => ["[data][timestamp]", "UNIX_MS"]
    }

Here is a screenshot of the resulting empty document:

You need to share your entire logstash pipeline and also any logstash logs with errors or warnings.

This filter alone would not cause this issue, as it would simply parse the date and store it in the @timestamp field.
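For reference, this is conceptually what the UNIX_MS pattern does: it interprets the string as milliseconds since the Unix epoch. A quick Python sketch using the sample value from the question:

```python
from datetime import datetime, timedelta, timezone

# The value the json filter placed into [data][timestamp]
raw = "1716886015302"

# UNIX_MS = milliseconds since the epoch; using timedelta keeps the
# arithmetic exact (no float rounding of the milliseconds)
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
ts = epoch + timedelta(milliseconds=int(raw))
print(ts.isoformat())  # 2024-05-28T08:46:55.302000+00:00
```

So a successful parse would land that instant in @timestamp, not blank out the document.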

Here is the whole config. As soon as the date filter is added, the behaviour can be observed: empty documents instead of the expected content with the date parsed. There are no errors in the log at all.

input {
    s3 {
        bucket => "aws-s3-bucket"
        region => "aws-region"
    }
}
filter {
    mutate {
        remove_field => ["[event][original]"]
    }
    json {
        source => "message"
        ecs_compatibility => "disabled"
        target => "data"
    }
    if [data][webaclId] {
        grok {
            match => { "[data][webaclId]" => "%{WORD}:%{WORD}:%{WORD:waf_version}:%{DATA:aws_region}:%{NUMBER}:%{WORD}/%{WORD}/%{DATA:webacl_name}/%{GREEDYDATA}" }
            named_captures_only => true
        }
    }
    mutate {
        convert => {
            "[data][timestamp]" => "string"
        }
    }
    date {
        match => ["[data][timestamp]", "UNIX_MS"]
    }
}
output {
    elasticsearch {
        hosts => ["host1", "host2"]
        index => "index-%{+YYYY.MM.dd}"
    }
}
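For narrowing something like this down, it can help to run the filter block in isolation with a generator input and a rubydebug stdout output, so every resulting event is printed in full on the console. A sketch (the sample message and the ARN-like webaclId value are made up):

```
input {
    generator {
        count => 1
        message => '{"timestamp":"1716886015302","webaclId":"arn:aws:wafv2:us-east-1:123456789012:regional/webacl/my-acl/abc"}'
    }
}
filter {
    json {
        source => "message"
        ecs_compatibility => "disabled"
        target => "data"
    }
    date {
        match => ["[data][timestamp]", "UNIX_MS"]
    }
}
output {
    stdout { codec => rubydebug }
}
```

If the date filter were really at fault, the printed events here would come out empty too; if they look fine, the problem is upstream of the filter block.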


OK, as it turns out, it does work. The empty documents that showed up were caused by invalid JSON coming from the source. I did not realise Logstash was processing older data, and I did not look far enough into the past to see the index being populated with data going back days.
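For anyone hitting the same thing: events the json filter fails to parse get tagged _jsonparsefailure, so they can be routed or counted in the pipeline. A quick offline check (a Python sketch; the sample lines are made up) can also spot offending lines before they ever reach Logstash:

```python
import json

def find_bad_lines(lines):
    """Return (line_number, error) pairs for lines that are not valid JSON."""
    bad = []
    for i, line in enumerate(lines, start=1):
        try:
            json.loads(line)
        except json.JSONDecodeError as err:
            bad.append((i, str(err)))
    return bad

sample = [
    '{"timestamp": "1716886015302"}',  # valid
    '{"timestamp": }',                 # invalid: missing value
]
print(find_bad_lines(sample))  # only line 2 is reported
```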