I've got an Elasticsearch ingest pipeline with a grok processor that pulls the timestamp out of a log file. I tested the pattern in the Grok Debugger and it works fine. The pipeline is super simple:
{
  "description" : "Parse Date from SQL Filebeat message",
  "processors" : [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:log_timestamp}"],
        "ignore_failure": true
      }
    }
  ]
}
As is my input:
2019-01-11 16:40:51.54 Logon Error: 18456, Severity: 14, State: 5.
2019-01-11 16:40:51.54 Logon Login failed for user 'abcdefg'. Reason: Could not find a login matching the name provided. [CLIENT: ]
2019-01-12 00:00:54.55 spid48s This instance of SQL Server has been using a process ID of 5660 since 1/9/2019 11:02:00 AM (local) 1/9/2019 4:02:00 PM (UTC). This is an informational message only; no user action is required.
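For reference, the pattern does match these lines when they are plain text. A rough Python equivalent of grok's `TIMESTAMP_ISO8601` (assumption: simplified — the real grok pattern also allows `T` separators, optional seconds, and time zones):

```python
import re

# Simplified stand-in for grok's TIMESTAMP_ISO8601 pattern
ts = re.compile(r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}(?:\.\d+)?")

line = "2019-01-11 16:40:51.54 Logon Error: 18456, Severity: 14, State: 5."
m = ts.search(line)
print(m.group(0))  # → "2019-01-11 16:40:51.54"
```

So the pattern itself is fine against clean input, which matches what the Grok Debugger showed.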
However, the Filebeat log shows:
2019-01-11T15:50:52.107-0500 DEBUG [elasticsearch] elasticsearch/client.go:526 Bulk item insert failed (i=8, status=500): {"type":"exception","reason":"java.lang.IllegalArgumentException: java.lang.IllegalArgumentException: Provided Grok expressions do not match field value: [\u00002\u00000\u00001\u00009\u0000-\u00000\u00001\u0000-\u00001\u00001\u0000 \u00001\u00005\u0000:\u00004\u00008\u0000:\u00001\u00006\u0000.\u00001\u00004\u0000 \u0000L\u0000o\u0000g\u0000o\u0000n\u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000E\u0000r\u0000r\u0000o\u0000r\u0000:\u0000 \u00001\u00008\u00004\u00005\u00006\u0000,\u0000 \u0000S\u0000e\u0000v\u0000e\u0000r\u0000i\u0000t\u0000y\u0000:\u0000 \u00001\u00004\u0000,\u0000 \u0000S\u0000t\u0000a\u0000t\u0000e\u0000:\u0000 \u00005\u0000.\u0000\r\u0000]","caused_by":{"type":"illegal_argument_exception","reason":"java.lang.IllegalArgumentException: Provided Grok expressions do not match field value: [\u00002\u00000\u00001\u00009\u0000-\u00000\u00001\u0000-\u00001\u00001\u0000 \u00001\u00005\u0000:\u00004\u00008\u0000:\u00001\u00006\u0000.\u00001\u00004\u0000 \u0000L\u0000o\u0000g\u0000o\u0000n\u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000E\u0000r\u0000r\u0000o\u0000r\u0000:\u0000 \u00001\u00008\u00004\u00005\u00006\u0000,\u0000 \u0000S\u0000e\u0000v\u0000e\u0000r\u0000i\u0000t\u0000y\u0000:\u0000 \u00001\u00004\u0000,\u0000 \u0000S\u0000t\u0000a\u0000t\u0000e\u0000:\u0000 \u00005\u0000.\u0000\r\u0000]","caused_by":{"type":"illegal_argument_exception","reason":"Provided Grok expressions do not match field value: [\u00002\u00000\u00001\u00009\u0000-\u00000\u00001\u0000-\u00001\u00001\u0000 \u00001\u00005\u0000:\u00004\u00008\u0000:\u00001\u00006\u0000.\u00001\u00004\u0000 \u0000L\u0000o\u0000g\u0000o\u0000n\u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000 \u0000E\u0000r\u0000r\u0000o\u0000r\u0000:\u0000 \u00001\u00008\u00004\u00005\u00006\u0000,\u0000 \u0000S\u0000e\u0000v\u0000e\u0000r\u0000i\u0000t\u0000y\u0000:\u0000 
\u00001\u00004\u0000,\u0000 \u0000S\u0000t\u0000a\u0000t\u0000e\u0000:\u0000 \u00005\u0000.\u0000\r\u0000]"}},"header":{"processor_type":"grok"}}
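Looking at the escaped field value in that error, every character is interleaved with a `\u0000` NUL — which is what a UTF-16 file looks like when its bytes are read as single-byte text, so the grok pattern can never match. A small Python sketch (assuming a short prefix standing in for the real payload) demonstrates this:

```python
# Prefix of the failing field value, reconstructed from the error's JSON
# escapes (assumption: stands in for the much longer real payload)
value = "\u00002\u00000\u00001\u00009\u0000-\u00000\u00001"

# Recover the original file bytes, then decode them as UTF-16
raw = value.encode("latin-1")       # b'\x002\x000\x001\x009\x00-\x000\x001'
print(raw.decode("utf-16-be"))      # → "2019-01"
```

The NUL-per-character interleaving disappears once the bytes are decoded as UTF-16, which suggests the source file's encoding is the real problem, not the grok pattern.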
I had to put `ignore_failure` in there so I can at least get the logs into Elasticsearch for now, but I need to be able to pull the timestamp for proper processing of the logs.
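If this is a SQL Server ERRORLOG, it is likely UTF-16 on disk; Filebeat's `encoding` option on the input can decode it before the message ever reaches the pipeline. A sketch, with a hypothetical path (assumption: verify the actual file encoding, e.g. utf-16le vs utf-16be, before setting this):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - 'C:\Program Files\Microsoft SQL Server\MSSQL\Log\ERRORLOG*'  # hypothetical path
    encoding: utf-16le  # assumption — check the file's actual byte order
```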