Hi all, I'm not able to parse MySQL's slow query log with Logstash.
The log file:
# Time: 2018-02-27T09:20:14.122543Z
# User@Host: user[user] @ [nnn.nnn.nnn.nn] Id: 148
# Query_time: 10.275441 Lock_time: 0.000000 Rows_sent: 0 Rows_examined: 0
use somedb;
SET timestamp=1519723214;
INSERT into sometable values ............<**an extreeeemely long line**>.....
...
...etc
# Time: 2023-09-26T10:35:06.833127Z
# User@Host: user[user] @ [nnn.nnn.nnn.nn] Id: 5724051
# Query_time: 3.335550 Lock_time: 0.000094 Rows_sent: 1 Rows_examined: 2792117
SET timestamp=1695724506;
SELECT FIELD_ID,ID,ISSUE_ID,LOCK_HASH,LOCK_TIME,`RANK`,TYPE FROM SomeTable WHERE FIELD_ID = 12345 AND BUCKET = 1 ORDER BY `RANK` DESC LIMIT 1;
# Time: 2023-09-26T10:35:49.178675Z
# User@Host: user[user] @ [nnn.nnn.nnn.nn] Id: 5724042
# Query_time: 3.780189 Lock_time: 0.000060 Rows_sent: 1 Rows_examined: 4116845
SET timestamp=1695724549;
SELECT COUNT(*) FROM SomeTable WHERE FIELD_ID = 12345 AND BUCKET = 2;
Each entry runs from one "# Time:" line to the next, so I've tried grouping them with the multiline codec:
input {
  file {
    path => "/path/to/the/slow.log"
    start_position => "beginning"
    codec => multiline {
      # every line that does NOT start with "# Time:" belongs to the previous entry
      pattern => "^# Time:"
      negate => true
      what => "previous"
    }
  }
}

filter {
  mutate {
    # this might not be needed; collapses the multiline event onto one line
    gsub => ['message', "\n", " "]
  }
  grok {
    match => { "message" => "^# Time:\s+%{TIMESTAMP_ISO8601:timestamp}\s+%{GREEDYDATA:message}" }
  }
}

output {
  # writing to a file first, just to verify events are coming through
  file {
    path => "/path/to/outputfile"
  }
}
But the output file never gets created!
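For what it's worth, here's an input variant I'm considering but haven't verified. It's based on a guess that the file input's sincedb is skipping a file it has already read, and that the multiline codec is holding back the final entry until a next "# Time:" line arrives (sincedb_path and auto_flush_interval are the two knobs I found in the docs):

input {
  file {
    path => "/path/to/the/slow.log"
    start_position => "beginning"
    # guess: force a re-read on every run (testing only); otherwise sincedb
    # remembers the file was already read and skips it
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^# Time:"
      negate => true
      what => "previous"
      # guess: flush the last buffered entry after 5s of silence instead of
      # waiting for a following "# Time:" line that may never arrive
      auto_flush_interval => 5
    }
  }
}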
I can read the input file, and I just need the filter to work in the simplest way possible so these lines can be sent as-is to an index.
Extra field splitting isn't really needed.
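For context, the real output will eventually go to Elasticsearch, something along these lines (the host and index name below are just placeholders):

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]        # placeholder host
    index => "mysql-slowlog-%{+YYYY.MM.dd}"   # placeholder index name
  }
}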
Can someone please help me create these output lines?
Thanks so very much.