Logstash Date Filter Not Recognizing the Pattern

Hello there,

I have a log file in CSV format (semicolon-separated instead of comma-separated) that I generated from PowerShell, where one of the fields contains a date. Using grok, I managed to pull the date out and store it in a field named "LastModified".

A couple of sample lines look like this:

"ghijkl";db-backup-archive;Config/;2018-05-01 12:08:12;Amazon.S3.Model.Owner;0MB;GLACIER
"abcedf";db-backup-archive;DB_Test/DB_Test_backup_2018_05_01_020004_1938647.bak;2018-05-01 12:20:19;Amazon.S3.Model.Owner;3.27MB;GLACIER

So the date would be:

2018-05-01 12:08:12
2018-05-01 12:20:19

Now, I thought that if I had my date filter match the "yyyy-MM-dd HH:mm:ss" pattern, it would work without a problem, but I keep getting the _dateparsefailure tag instead.

Here's my full Logstash config:

# The # character at the beginning of a line indicates a comment. Use comments to describe your configuration.
input {
    beats {
        port => "5044"
    }
}

filter {

	## This prevents duplicate data from being added, using "message" field as the hash key
	fingerprint {
		source => "message"
		target => "[@metadata][fingerprint]"
		method => "SHA256"
		key => "Empirics"
	}

	if [fields][logtype] == "BackupStatus" {
		grok {
			match => { "message" => "\A(?<ETag>[^;]+);(?<BucketName>[^;]+);(?<DBName>[^/]+)/(?<Filename>[^;]+);(?<LastModified>[^;]+);(?<Owner>[^;]+);(?<Size>[^;]+);(?<StorageClass>[^;]+)" }
			tag_on_failure => ["warning_unparsed_backupstatus"]
		}
	} else {
		mutate {
			add_tag => [ "warning_not_grok_parsed" ]
		}
	}
	
	## If failed to parse, drop the data
	#if "warning_unparsed_backupstatus" in [tags] {
	#	drop { }
	#}

	date {
		match => [ "LastModified", "yyyy-MM-dd HH:mm:ss" , "yyyy-MM-dd HH:mm:ss " , " yyyy-MM-dd HH:mm:ss" , "yyyy-MM-dd  HH:mm:ss" ]
	}

}

output {
	elasticsearch {
		hosts => ["127.0.0.1:9200"]
		document_id => "%{[@metadata][fingerprint]}"
		index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}" 
	}
}
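
One thing I haven't ruled out yet is invisible leading/trailing whitespace in the field, which is why I tried those extra patterns with spaces in them. Would stripping the field before the date filter be a cleaner way to rule that out? Something like this (untested sketch, names taken from my config above):

	filter {
		## Remove any leading/trailing whitespace that would break the date match
		mutate {
			strip => ["LastModified"]
		}

		date {
			match => [ "LastModified", "yyyy-MM-dd HH:mm:ss" ]
		}
	}
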

The grok part certainly works, because I can see the LastModified field along with the other fields I defined in the grok match pattern.

Can anyone help me with this?
If it helps, I'm on Logstash 6.3.2.

[Screenshot: Kibana Discover]
As you can see, it manages to extract all of the fields, and I can see them in Kibana Discover (I crossed out some fields, but they are definitely there).
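
To take Beats and Elasticsearch out of the picture, I figure I could also reproduce this with a minimal config that feeds one sample line through stdin and prints the parsed event (untested sketch, same grok pattern as above):

	input {
		stdin { }
	}

	filter {
		grok {
			match => { "message" => "\A(?<ETag>[^;]+);(?<BucketName>[^;]+);(?<DBName>[^/]+)/(?<Filename>[^;]+);(?<LastModified>[^;]+);(?<Owner>[^;]+);(?<Size>[^;]+);(?<StorageClass>[^;]+)" }
		}

		date {
			match => [ "LastModified", "yyyy-MM-dd HH:mm:ss" ]
		}
	}

	output {
		stdout { codec => rubydebug }
	}

Then I could pipe one of the sample lines in and check whether @timestamp gets set or the _dateparsefailure tag shows up again.
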
