Dear Elastic,
I am having problems getting this grok pattern to work in an ingest pipeline.
PUT _ingest/pipeline/parse_caddy
{
  "description": "parsing fields from Caddy",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{IP:remote} - %{USER:user} \[%{HTTPDATE:time}\] \"%{WORD:method} %{URIPATHPARAM:uripath} %{WORD:proto}\/%{NUMBER:proto_ver}\" %{NUMBER:status_code} %{NUMBER:body_length}"]
      }
    }
  ]
}
Here's the test data I'm grokking:
8.8.8.8 - - [07/Jul/2020:11:23:42 +0000] "GET / HTTP/2.0" 304 0
8.8.8.8 - - [07/Jul/2020:11:23:43 +0000] "GET /site.js HTTP/2.0" 304 0
However, grok does not want to work with \[%{HTTPDATE:time}\] and struggles to understand what "[" is.
After chatting on Slack with an engineer (cheers, Ben!), he suggested escaping the brackets with double backslashes:
"%{IP:remote} - %{USER:user} \\[%{HTTPDATE:time}\\] \"%{WORD:method} %{URIPATHPARAM:uripath} %{WORD:proto}\/%{NUMBER:proto_ver}\" %{NUMBER:status_code} %{NUMBER:body_length}"
However, after running this new grok, I got the following error:
(status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [user] tried to parse field [user] as object, but found a concrete value"}
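For reference, the escaped pattern can also be tested with the simulate API, which runs the pipeline without indexing the document (and so without touching the index mapping for [user]) — a sketch using the pipeline name above and the first sample line:

POST _ingest/pipeline/parse_caddy/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "8.8.8.8 - - [07/Jul/2020:11:23:42 +0000] \"GET / HTTP/2.0\" 304 0"
      }
    }
  ]
}

If this simulate call succeeds but indexing still fails with the mapper_parsing_exception, that would point at the target index's mapping (e.g. [user] already mapped as an object) rather than the grok pattern itself.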
Any suggestions on how I can get Elasticsearch to cooperate? I have run this in grokdebug and it worked absolutely fine, so I'm rather confused...