Unable to grok ingest pipeline for Caddy log (even after running in grok debugger)

Dear Elastic,

I am having problems getting this grok pattern to work in an ingest pipeline.

PUT _ingest/pipeline/parse_caddy
{
  "description": "parsing fields from Caddy",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{IP:remote} - %{USER:user} \[%{HTTPDATE:time}\] \"%{WORD:method} %{URIPATHPARAM:uripath} %{WORD:proto}\/%{NUMBER:proto_ver}\" %{NUMBER:status_code} %{NUMBER:body_length}"]
      }
    }
  ]
}

Here's the test data I'm grokking:

- - [07/Jul/2020:11:23:42 +0000] "GET / HTTP/2.0" 304 0
- - [07/Jul/2020:11:23:43 +0000] "GET /site.js HTTP/2.0" 304 0

However, Grok does not want to work with [%{HTTPDATE:time}] and struggles to understand what the literal "[" is.

I chatted on Slack with an engineer (cheers Ben!) and he suggested using:

"%{IP:remote} - %{USER:user} \\[%{HTTPDATE:time}\\] \"%{WORD:method} %{URIPATHPARAM:uripath} %{WORD:proto}\/%{NUMBER:proto_ver}\" %{NUMBER:status_code} %{NUMBER:body_length}"

However, after running this new grok, I got the following error:

(status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [user] tried to parse field [user] as object, but found a concrete value"}

Any suggestions on how I can get Elasticsearch to co-operate? I have run this in grokdebug and it worked absolutely fine, so I'm rather confused...


I believe this is not a problem with the grok pattern itself but with the mapping.
As far as I can tell from the log, the mapping for the user field is currently an object, while your grok pattern produces a concrete (text) value for it, which in your sample data is probably "-".
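
You can confirm the grok itself is fine by simulating the pipeline, for example like this (using 203.0.113.1 as a placeholder address, since your sample lines don't show an IP):

POST _ingest/pipeline/parse_caddy/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "203.0.113.1 - - [07/Jul/2020:11:23:42 +0000] \"GET / HTTP/2.0\" 304 0"
      }
    }
  ]
}

If this returns the parsed fields but indexing still fails, the problem is the mapping, not the grok.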

You need to either change your mapping or change the grok pattern for the user field to something like %{USER:user.foo}, where foo is the sub-field name.
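
For example, with name as the sub-field (which also happens to match the ECS user.name field), the patterns line would become:

"patterns": ["%{IP:remote} - %{USER:user.name} \\[%{HTTPDATE:time}\\] \"%{WORD:method} %{URIPATHPARAM:uripath} %{WORD:proto}\/%{NUMBER:proto_ver}\" %{NUMBER:status_code} %{NUMBER:body_length}"]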

What does your mapping look like currently?
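
If you're not sure, you can pull the current mapping for just that field (replace your-index with your actual index name, which isn't shown in the thread):

GET your-index/_mapping/field/user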

user.name fixed the mapping problem and I was able to parse the data. Thank you for your help!
