Filebeat / MySQL slow log

Hi there,
I'm trying to ingest the MySQL slow log into Elasticsearch.

Since the slow log format of my MySQL version doesn't match the pattern defined in /usr/share/filebeat/module/mysql/slowlog/ingest/pipeline.json, I've modified the pattern directly in that file.

When I try to test it with:

rm -f /tmp/testreg.json; filebeat -e -v -c /etc/filebeat/filebeat.yml -E filebeat.registry_file=/tmp/testreg.json -E output.elasticsearch.enabled=false -E output.console.pretty=true

I still get:

"# User@Host: admin[admin] @  [my.ip.addr.ess]\n# Query_time: 4.564669  Lock_time: 0.000038 Rows_sent: 0  Rows_examined: 1\nSET timestamp=1518674946;\nUPDATE blabla SET blabla=blabla+1 '\nWHERE ( blabla = '444') );\n# Time: 180215  7:09:09"

As you can see, the "# Time: 180215" line ends up at the end of the message instead of at the beginning.

My slowlog file has the following format:

# Time: 180215  7:09:09
# User@Host: admin[admin] @  [my.ip.addr.ess]
# Query_time: 4.564669  Lock_time: 0.000038 Rows_sent: 0  Rows_examined: 1
SET timestamp=1518674946;
UPDATE blabla SET blabla=blabla+1 WHERE ( blabla = '444') );

I've tested the modified pattern in the Dev Tools and it works like a charm.
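
In case it helps to reproduce that test, the Simulate Pipeline API in the Dev Tools console is a quick way to run a sample document through the loaded pipeline. The sketch below assumes the pipeline is already loaded under the id Filebeat 6.2.1 uses for this module, and uses the sample entry above as the test document:

POST _ingest/pipeline/filebeat-6.2.1-mysql-slowlog-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "# Time: 180215  7:09:09\n# User@Host: admin[admin] @  [my.ip.addr.ess]\n# Query_time: 4.564669  Lock_time: 0.000038 Rows_sent: 0  Rows_examined: 1\nSET timestamp=1518674946;\nUPDATE blabla SET blabla=blabla+1 WHERE ( blabla = '444') );"
      }
    }
  ]
}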

Do I need to rebuild the mysql module in Filebeat somehow?

Many thanks in advance
G.

Once a pipeline has been loaded, it is not updated when the file changes. You can delete the previously loaded pipeline using the Dev Tools:

DELETE _ingest/pipeline/my-pipeline-id

Then Filebeat will upload the new, modified pipeline you created.
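
If you are not sure about the exact id, you can list the loaded pipelines first and pick out the mysql slowlog one; for a 6.2.1 install it should be something like this:

GET _ingest/pipeline

DELETE _ingest/pipeline/filebeat-6.2.1-mysql-slowlog-pipeline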

Hi @kvch,
Thanks for your reply.

I've tried to delete the pipeline but I'm still facing the same issue.

Filebeat is still putting "# User@Host: xxx" at the beginning of the message field instead of "# Time: xxx":

"# User@Host: admin[admin] @  [my.ip.addr.ess]\n# Query_time: 4.564669  Lock_time: 0.000038 Rows_sent: 0  Rows_examined: 1\nSET timestamp=1518674946;\nUPDATE blabla SET blabla=blabla+1 '\nWHERE ( blabla = '444') );\n# Time: 180219  7:09:09"

Thanks
G.

Could you share the modified pipeline?

Sure!

{
  "filebeat-6.2.1-mysql-slowlog-pipeline" : {
    "description" : "Pipeline for parsing MySQL slow logs.",
    "processors" : [
      {
        "grok" : {
          "field" : "message",
          "patterns" : [
            "^# Time:%{INT:mysql.slowlog.date} %{SPACE} %{TIME:mysql.slowlog.time}\n# User@Host: %{USER:mysql.slowlog.user}(\\[[^\\]]+\\])? @ %{SPACE} \\[(%{IP:mysql.slowlog.ip})?\\](\\s*Id:\\s* %{NUMBER:mysql.slowlog.id})?\n# Query_time: %{NUMBER:mysql.slowlog.query_time.sec}\\s* Lock_time: %{NUMBER:mysql.slowlog.lock_time.sec}\\s* Rows_sent: %{NUMBER:mysql.slowlog.rows_sent}\\s* Rows_examined: %{NUMBER:mysql.slowlog.rows_examined}\n(SET timestamp=%{NUMBER:mysql.slowlog.timestamp};\n)?%{GREEDYMULTILINE:mysql.slowlog.query}"
          ],
          "pattern_definitions" : {
            "GREEDYMULTILINE" : "(.|\n)*"
          },
          "ignore_missing" : true
        }
      },
      {
        "remove" : {
          "field" : "message"
        }
      },
      {
        "date" : {
          "formats" : [
            "UNIX"
          ],
          "ignore_failure" : true,
          "field" : "mysql.slowlog.timestamp",
          "target_field" : "@timestamp"
        }
      },
      {
        "gsub" : {
          "field" : "mysql.slowlog.query",
          "pattern" : "\n# Time: [0-9]+ [0-9][0-9]:[0-9][0-9]:[0-9][0-9](\\.[0-9]+)?$",
          "replacement" : "",
          "ignore_failure" : true
        }
      }
    ],
    "on_failure" : [
      {
        "set" : {
          "field" : "error.message",
          "value" : "{{ _ingest.on_failure_message }}"
        }
      }
    ]
  }
}

Thanks for your help!

Hi,

The problem was in the file slowlog.yml: I had to adapt the multiline pattern in that file as well, not just the grok pattern in the ingest pipeline.
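
For anyone hitting the same thing, the relevant bit is the multiline section of the module's slowlog.yml (under module/mysql/slowlog/config/ in a default install). The snippet below is only a sketch of the kind of change, assuming the stock config anchors events on the "# User@Host:" line (which would explain why the "# Time:" line ended up attached to the previous event) and that every entry in your log starts with a "# Time:" line, as in my sample above:

multiline:
  # Assumption: every slow log entry begins with a "# Time:" line, as in the
  # sample above, so start a new event on that line instead of "# User@Host:".
  pattern: '^# Time: '
  negate: true
  match: after

With negate: true and match: after, every line that does not match the pattern is appended to the event started by the last line that did match it.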

Thanks for your help!
G.
