Filebeat mysql slowlog pipeline.json decoding failure


I have a problem with MySQL slowlog parsing. I tried to change the MySQL slowlog pipeline.json file so it could parse my slow logs. Now it looks like this:

{
  "description": "Pipeline for parsing MySQL slow logs.",
  "processors": [{
    "grok": {
      "field": "message",
      "patterns": [
        "^# User@Host: %{USER:mysql.slowlog.user}(\\[[^\\]]+\\])? @ %{} \\[(%{IP:mysql.slowlog.ip})?\\](\\s*Id:\\s* %{})?\n# Thread_id: %{NUMBER:mysql.slowlog.thread_id}\s* Schema:%{DATA:mysql.slowlog.schema}\s* QC_hit: %{DATA:mysql.slowlog.qc_hit}\n# Query_time: %{NUMBER:mysql.slowlog.query_time.sec}\\s* Lock_time: %{NUMBER:mysql.slowlog.lock_time.sec}\\s* Rows_sent: %{NUMBER:mysql.slowlog.rows_sent}\\s* Rows_examined: %{NUMBER:mysql.slowlog.rows_examined}\n# Rows_affected: %{NUMBER:mysql.slowlog.rows_affected}\n(SET timestamp=%{NUMBER:mysql.slowlog.timestamp};\n)?%{GREEDYMULTILINE:mysql.slowlog.query}"
      ],
      "pattern_definitions" : {
        "GREEDYMULTILINE" : "(.|\n)*"
      },
      "ignore_missing": true
    }
  }, {
    "remove": {
      "field": "message"
    }
  }, {
    "date": {
      "field": "mysql.slowlog.timestamp",
      "target_field": "@timestamp",
      "formats": ["UNIX"],
      "ignore_failure": true
    }
  }, {
    "gsub": {
      "field": "mysql.slowlog.query",
      "pattern": "\n# Time: [0-9]+ [0-9][0-9]:[0-9][0-9]:[0-9][0-9](\\.[0-9]+)?$",
      "replacement": "",
      "ignore_failure": true
    }
  }],
  "on_failure" : [{
    "set" : {
      "field" : "error.message",
      "value" : "{{ _ingest.on_failure_message }}"
    }
  }]
}

But when I try to use it, I see this error:

2018-03-13T21:52:24.643+0200 ERROR pipeline/output.go:74 Failed to connect: Connection marked as failed because the onConnect callback failed: Error getting pipeline for fileset mysql/slowlog: Error JSON decoding the pipeline file: ingest/pipeline.json: invalid character 's' in string escape code

Maybe I made some mistakes in the Grok pattern? The logs I am trying to parse look like this:

# User@Host: root[root] @ localhost []
# Thread_id: 3  Schema:   QC_hit: No
# Query_time: 5.007341  Lock_time: 0.000000  Rows_sent: 1  Rows_examined: 0
# Rows_affected: 0
SET timestamp=1520962678;
select sleep(5);

Can you please help me find the problem?

As the error states, your document above is not valid JSON. I ran it quickly through a JSON validator and it seems to complain about the \s part. As soon as that is removed, it is valid. I wonder if you need to double escape here?

So I shouldn't use \s?
BTW, when should I double escape then?

I think there are 2 things here:

All Filebeat does is take the JSON document and try to load it into ES.

I don't know if double escaping works here; it's just a guess. For testing, I recommend loading the JSON doc via curl or similar directly into ES, so you get a direct response. To check whether it's valid JSON I used my editor's validator, but there are many web pages out there where you can paste JSON to check its validity.
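If Python happens to be handy, a quick local check gives the same direct feedback as a web validator. This is just a sketch; the inline raw strings stand in for the contents of your pipeline.json file:

```python
import json

# Report where the JSON decoder chokes (e.g. on a bare \s escape),
# with a line/column position instead of Filebeat's terser error.
def validate(text):
    try:
        json.loads(text)
        return "valid JSON"
    except json.JSONDecodeError as e:
        return f"invalid JSON at line {e.lineno}, column {e.colno}: {e.msg}"

# Raw strings, so the backslashes are exactly what would be on disk.
print(validate(r'{"pattern": "Lock_time:\s*"}'))   # \s is not a JSON escape
print(validate(r'{"pattern": "Lock_time:\\s*"}'))  # doubled backslash is fine
```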

So the first step is to get valid JSON, and then see if the pattern still works.
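On when to double escape: the JSON decoder consumes one level of backslashes, and the regex engine behind grok consumes the next, so a regex \s has to appear as \\s in the pipeline file. A minimal illustration using plain Python's json module (not Elasticsearch itself):

```python
import json

# Doubled on disk: JSON strips one backslash, leaving the regex \s for grok.
on_disk = r'{"pattern": "Rows_sent:\\s*%{NUMBER:rows}"}'
decoded = json.loads(on_disk)["pattern"]
print(decoded)  # Rows_sent:\s*%{NUMBER:rows}

# Single on disk: the JSON decoder rejects it before grok ever sees it.
try:
    json.loads(r'{"pattern": "Rows_sent:\s*"}')
except json.JSONDecodeError:
    print("rejected: \\s is not a valid JSON string escape")
```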

Thanks, I think I found the mistake: some of the \s sequences in the pattern were escaped only once. I changed them and now it is valid JSON, and it loads successfully.

Thank you for the help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.