MySQL slow log

Hi all, I am not able to successfully parse MySQL's slow log using Logstash.

The log file:

# Time: 2018-02-27T09:20:14.122543Z
# User@Host: user[user] @  [nnn.nnn.nnn.nn]  Id:   148
# Query_time: 10.275441  Lock_time: 0.000000 Rows_sent: 0  Rows_examined: 0
use somedb;
SET timestamp=1519723214;
INSERT into sometable values ............<**an extreeeemely long line**>.....
...
...etc
# Time: 2023-09-26T10:35:06.833127Z
# User@Host: user[user] @  [nnn.nnn.nnn.nn]  Id: 5724051
# Query_time: 3.335550  Lock_time: 0.000094 Rows_sent: 1  Rows_examined: 2792117
SET timestamp=1695724506;
SELECT FIELD_ID,ID,ISSUE_ID,LOCK_HASH,LOCK_TIME,`RANK`,TYPE FROM SomeTable WHERE FIELD_ID = 12345 AND BUCKET = 1 ORDER BY `RANK` DESC LIMIT 1;
# Time: 2023-09-26T10:35:49.178675Z
# User@Host: user[user] @  [nnn.nnn.nnn.nn]  Id: 5724042
# Query_time: 3.780189  Lock_time: 0.000060 Rows_sent: 1  Rows_examined: 4116845
SET timestamp=1695724549;
SELECT COUNT(*) FROM SomeTable WHERE FIELD_ID = 12345 AND BUCKET = 2;

I've tried:

input {
  file {
    path => "/path/to/the/slow.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^# Time:"
      negate => true
      what => "previous"
    }
  }
}
filter {
  mutate { # this might not be needed
    gsub => ['message', "\n", " "]
  }

  grok {
    match => { "message" => "^# Time:\s+%{TIMESTAMP_ISO8601:timestamp}\s+%{GREEDYDATA:message}" }
  }
}
output {
  file {
    path => "/path/to/outputfile"
  }
}

But the output file just doesn't get created!
I can read the input file, and I just need the filter to work in the simplest way possible, so that these lines are sent as-is to an index.
Extra field splitting is not really needed.

Can someone please help me create these output lines?
Thanks so very much.
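One thing I'm wondering about is whether the file input is even re-reading the log on my repeated test runs. Below is a minimal sketch of the input I'm planning to try next; the sincedb_path => "/dev/null" line is my own addition (as I understand it, it stops Logstash from remembering how far it has already read the file, which would explain seeing nothing on a second run):

input {
  file {
    path => "/path/to/the/slow.log"
    start_position => "beginning"
    # my addition: do not persist the read position, so every run starts from the beginning
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^# Time:"
      negate => true
      what => "previous"
    }
  }
}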

I even tried this after looking at another forum member's question/answer.
Here I look for "Time:" to start the pattern:

input {
  file {
    path => "/path/to/the/log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^# Time:"
      negate => true
      what => "previous"
      auto_flush_interval => 2
    }
  }
}

filter {
  mutate {
    gsub => ['message', "\n", " "]
  }

  grok {
    match => [ "message" , "%{GREEDYDATA}Time:%{SPACE}%{TIMESTAMP_ISO8601:mysql_slow_querydate} %{GREEDYDATA}User@Host: (?:%{USERNAME:mysql_slow_clientuser})\[(?:%{DATA:mysql_cluster})\] @ %{SPACE}\[(?:%{DATA:mysql_slow_clientip})\]%{SPACE}Id:%{SPACE}%{NUMBER:mysql_slow_id} %{GREEDYDATA}Query_time: %{NUMBER:mysql_slow_querytime:float}(?:%{SPACE})Lock_time: %{NUMBER:mysql_slow_locktime:float}(?:%{SPACE})Rows_sent: %{NUMBER:mysql_slow_rowssent:int}(?:%{SPACE})Rows_examined: %{NUMBER:mysql_slow_rowsexamined:int}%{GREEDYDATA:mysql_slow_query};" ]
  }
}

because sometimes the line below is missing:

use somedb;

So, after Rows_examined I grab all the rest into GREEDYDATA.
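
As an alternative, I've also been sketching a shorter pattern that keeps the newlines in the event (i.e. drops the gsub) and uses (?m) so GREEDYDATA can span the remaining lines. This is just a rough draft I haven't verified yet, and all the mysql_slow_* field names are my own naming:

filter {
  grok {
    # (?m) lets GREEDYDATA match across the remaining lines of the multiline event
    match => {
      "message" => "(?m)^# Time: %{TIMESTAMP_ISO8601:mysql_slow_querydate}\s*\n# User@Host: %{USERNAME:mysql_slow_clientuser}\[%{DATA:mysql_cluster}\] @\s+\[%{DATA:mysql_slow_clientip}\]\s+Id:\s+%{NUMBER:mysql_slow_id}\s*\n# Query_time: %{NUMBER:mysql_slow_querytime:float}\s+Lock_time: %{NUMBER:mysql_slow_locktime:float}\s+Rows_sent: %{NUMBER:mysql_slow_rowssent:int}\s+Rows_examined: %{NUMBER:mysql_slow_rowsexamined:int}\s*\n%{GREEDYDATA:mysql_slow_query}"
    }
  }
}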

Either way, the grok is still not working for me: no output file gets created.
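
To figure out whether any events are reaching the output at all (and whether they carry a _grokparsefailure tag), I'm also thinking of temporarily swapping the file output for stdout. Again, just a debugging sketch, not part of the real pipeline:

output {
  # temporary debug output: prints every event with all of its fields,
  # so a _grokparsefailure tag added by the grok filter is easy to spot
  stdout { codec => rubydebug }
}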
