Can't parse Unix timestamp with Logstash. Getting _dateparsefailure

Hi,
Our log files are in json format. Here is a small snippet of the file

    {
        "timestamp": 1608191939682,
        "formatVersion": 1,
        "webaclId": "69c2a78d-d849-4dca-bccd-xxx",
        "terminatingRuleId": "Default_Action",
        "terminatingRuleType": "REGULAR",
        "action": "ALLOW",
        "terminatingRuleMatchDetails": [],
        "httpSourceName": "ALB",
        "httpSourceId": "560cxxxx-app/ALB-WAF-2/0bf0bd24bxxx",
        .....

The relevant portion of our conf file looks like this:

    input {
        s3 {
            bucket => "bucket"
            region => "us-east-1"
            type => "type"
            codec => json
            access_key_id => "xx"
            secret_access_key => "yy"
        }
    }
    filter {
        date {
            match => [ "timestamp", "UNIX_MS" ]
        }
        .............
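
For reference, UNIX_MS expects the field to contain milliseconds since the epoch, which matches the timestamp value in the snippet above (1608191939682 is 2020-12-17T07:58:59.682Z). Here is a minimal sketch we can use to test the filter in isolation, assuming a local Logstash and feeding one JSON line on stdin:

    input { stdin { codec => json } }
    filter {
        date {
            # same match as in the real pipeline
            match => [ "timestamp", "UNIX_MS" ]
        }
    }
    output { stdout { codec => rubydebug } }

If @timestamp comes out right here, the date filter itself is fine and the problem is elsewhere in the configuration.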

We are getting a _dateparsefailure tag on each record, and our @timestamp field is set to the time Logstash processed the record rather than the Unix timestamp in the log itself, which is not what we want. Can someone help us with this?

We have an update: the conf file Logstash was using was located in a different folder. After editing the correct conf file, the date parsing is working correctly, but the records are still tagged with _dateparsefailure.
It is not a big deal anymore, but we would love to know why this tag is still being applied to our Elasticsearch documents. Thanks.

Are you pointing path.config at a directory? If you are, then every file in the directory is included in the configuration, so if another file has a different date filter, that may be what is adding the tag.
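
For example, with path.config pointing at a directory (the file names below are hypothetical), Logstash concatenates every file into a single pipeline, so all filters run on all events:

    # /etc/logstash/conf.d/waf.conf (hypothetical)
    filter {
        date { match => [ "timestamp", "UNIX_MS" ] }
    }

    # /etc/logstash/conf.d/other.conf (hypothetical)
    filter {
        # This also runs on the WAF events. They have no "time"
        # field, so this filter tags them _dateparsefailure even
        # though the first date filter parsed them successfully.
        date { match => [ "time", "ISO8601" ] }
    }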

I don't believe that is the case. I have four config files in total, and the date parsing is working correctly in all four cases. However, two of them (both very similar) are generating this tag even though the parsing appears to be working correctly.
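
One thing I can try to narrow it down: the date filter has a tag_on_failure option (default ["_dateparsefailure"]), so giving each config file a distinct failure tag should show which filter is firing. A sketch, with a made-up tag name:

    date {
        match => [ "timestamp", "UNIX_MS" ]
        # distinct tag per config file (hypothetical name)
        tag_on_failure => [ "_dateparsefailure_waf" ]
    }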

If you have four config files, are you using pipelines.yml to run them in four different pipelines? If not, every event goes through every filter.
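
A sketch of what that looks like in pipelines.yml, with hypothetical ids and paths:

    # pipelines.yml
    - pipeline.id: waf
      path.config: "/etc/logstash/conf.d/waf.conf"
    - pipeline.id: alb
      path.config: "/etc/logstash/conf.d/alb.conf"
    # ...one entry per config file; each pipeline then only
    # runs its own filters on its own events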

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.