Ingest pipeline "Failed to parse content to map"

Why is this log causing an error, and what does "Failed to parse content to map" mean? I am pretty new to pipelines and Elasticsearch in general, so any help would be greatly appreciated:

INPUT

POST _ingest/pipeline/_simulate
{
    "pipeline": {
        "description": "...",
        "processors": [
            {
  "gsub": {
    "field": "message",
    "pattern": "[:,{}]",
    "replacement": " "
  }
},
{
  "dissect": {
    "field": "message",
    "append_separator": ":", 
    "pattern" : "%{information} %{thread} %{date} %{+time/1} %{+time/2} %{+time/3} %{app} %{} %{method} %{} %{} %{} %{scope} %{} %{localUserEmail} %{} %{} %{cellName}"
   }
},
{
"remove": {
    "field": "message"
  }
}
]
    },
    "docs": [
        {
            "_source": {
                "message": "Information","thread","06/10/21","18:26:06","App{"method":"build","_meta":{"scope":"local","localUserEmail":"my@email.co.uk"},"cellName":"BoardPageSectionContentRecord"}"
            }
        }
    ]
}

OUTPUT

{
  "error" : {
    "root_cause" : [
      {
        "type" : "parse_exception",
        "reason" : "Failed to parse content to map"
      }
    ],
    "type" : "parse_exception",
    "reason" : "Failed to parse content to map",
    "caused_by" : {
      "type" : "json_parse_exception",
      "reason" : "Unexpected character (',' (code 44)): was expecting a colon to separate field name and value\n at [Source: (byte[])\"{\n    \"pipeline\": {\n        \"description\": \"...\",\n        \"processors\": [\n            {\n  \"gsub\": {\n    \"field\": \"message\",\n    \"pattern\": \"[:,{}]\",\n    \"replacement\": \" \"\n  }\n},\n{\n  \"dissect\": {\n    \"field\": \"message\",\n    \"append_separator\": \":\", \n    \"pattern\" : \"%{information} %{thread} %{date} %{+time/1} %{+time/2} %{+time/3} %{app} %{} %{method} %{} %{} %{} %{scope} %{} %{localUserEmail} %{} %{} %{cellName}\"\n   }\n},\n{\n\"remove\": {\n    \"field\": \"message\"\n  }\n}\n]\n    },\n    \"docs\": [\n        \"[truncated 258 bytes]; line: 29, column: 51]"
    }
  },
  "status" : 400
}
I found that adding two more quotes to the beginning and end of the log allowed parsing to take place (I'm not sure why), which is why I've configured the pipeline as I have so far. I've tried other pipelines without the additional quotes, but to no avail.
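For what it's worth, the failure can be reproduced outside Elasticsearch: the log line itself contains double quotes, so splicing it directly into a JSON request body produces invalid JSON. A minimal Python sketch (variable names are illustrative, and the log line is shortened):

```python
import json

# The log line contains unescaped double quotes.
log_line = 'Information","thread","06/10/21","18:26:06","App{"method":"build"}'

# Splicing it directly into a JSON body is invalid JSON.
raw_body = '{"message": "' + log_line + '"}'
try:
    json.loads(raw_body)
except json.JSONDecodeError as err:
    # Fails for the same reason as the simulate call: the parser hits a comma
    # where it expects a colon after what it took to be a field name.
    print("invalid JSON:", err)

# Escaping the inner quotes (json.dumps emits \") yields a body the JSON
# parser accepts -- the same effect as Kibana's triple-quote syntax.
escaped_body = json.dumps({"message": log_line})
print(json.loads(escaped_body)["message"] == log_line)  # True
```

This matches the `json_parse_exception` in the output below: "was expecting a colon to separate field name and value".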

What does the log entry you are pushing into the pipeline look like?

This is an example log entry:

"Information","thread","06/10/21","18:26:06","App{"method":"build","_meta":{"scope":"local","localUserEmail":"my@email.co.uk"},"cellName":"BoardPageSectionContentRecord"}"

Can you share a reproducible example that contains valid JSON? I put triple double quotes around your string in Kibana and then everything works, but I don't know if that reflects your setup.

Also, can you provide the version you tested with? I'm running 7.13.2; maybe you can upgrade and retry as well.

Thank you!

Thank you for your response!

I raised this Topic whilst using 7.13.0. I have just now upgraded to 7.13.2 and got the same outcome.

Please see the string below with valid JSON (triple-quoted). The OUTPUT is pretty much what we're after, apart from the additional backslashes:

INPUT

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "...",
    "processors": [
      {
        "gsub": {
          "field": "message",
          "pattern": "[:,{}]",
          "replacement": " "
        }
      },
      {
        "dissect": {
          "field": "message",
          "append_separator": ":",
          "pattern": "%{information} %{thread} %{date} %{+time/1} %{+time/2} %{+time/3} %{app} %{} %{method} %{} %{} %{} %{scope} %{} %{localUserEmail} %{} %{} %{cellName}"
        }
      },
      {
        "remove": {
          "field": "message"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": """Information","thread","06/10/21","18:26:06","App{"method":"build","_meta":{"scope":"local","localUserEmail":"my@email.co.uk"},"cellName":"BoardPageSectionContentRecord"}"""
      }
    }
  ]
}

OUTPUT

{
  "docs" : [
    {
      "doc" : {
        "_index" : "_index",
        "_type" : "_doc",
        "_id" : "_id",
        "_source" : {
          "date" : "\"06/10/21\"",
          "app" : "\"App",
          "localUserEmail" : "\"my@email.co.uk\"",
          "method" : "\"build\"",
          "cellName" : "\"BoardPageSectionContentRecord\" ",
          "scope" : "\"local\"",
          "information" : "Information\"",
          "thread" : "\"thread\"",
          "time" : "\"18:26:06\""
        },
        "_ingest" : {
          "timestamp" : "2021-06-29T10:41:19.491221958Z"
        }
      }
    }
  ]
}
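A note on the backslashes in the output above: they are JSON string escaping in the response, so the stored field values contain literal double quotes. A small Python sketch illustrating this, with `re.sub` standing in for the `gsub` processor (adding `"` to the character class is an untested suggestion, not something verified against an actual pipeline):

```python
import re

# In the simulate response, a value like "\"06/10/21\"" is JSON string
# escaping: the stored value is ten characters, quotes included.
value = "\"06/10/21\""
print(len(value))  # 10: eight characters of date plus two literal quotes

# The pipeline's gsub pattern [:,{}] replaces colons, commas and braces with
# spaces but leaves double quotes in place. Adding the quote to the class --
# [:,{}"] -- would strip them too (re.sub mimics gsub here).
message = '"Information","thread","06/10/21","18:26:06","App{"method":"build"}"'
cleaned = re.sub(r'[:,{}"]', ' ', message)
print('"' in cleaned)  # False: no quotes survive
```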

I'm sorry, but I don't understand. The output now does not contain an exception, so why do you say you got the same outcome?

I'm sorry I wasn't clear: I tested without the triple quotes around the string on version 7.13.2 and got the same outcome ("parse_exception") that I posted in my topic introduction.

You asked for a reproducible example that contains valid JSON. The only way I can get valid JSON at the moment with my log is to add triple quotes around the string, which is what you can see above.

I hope I have understood you correctly?

Also, I just wanted to add that my log is a trace message written in standard ColdFusion log format, just in case that helps.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.