How to parse MySQL slow queries into Elastic

Hi,

I have a MySQL instance in GCP for which I have already configured a logging sink to Pub/Sub, and from a VM with Filebeat installed I am able to ship the slow queries that arrive on that Pub/Sub topic to Elastic.

My problem is that the slow query log arrives line by line, and I would like to group all the lines of one slow query into a single document. For example, this entry:

```
# Time: 2022-04-13T10:06:52.167264Z
# User@Host: service_db_usr[service_db_usr] @ [10.72.3.x] thread_id: 253176 server_id: 4054104550
# Query_time: 0.110813 Lock_time: 0.000063 Rows_sent: 189 Rows_examined: 391226
use mydatabase;
SET timestamp=1649844412;
select pipelinein0_.id as id1_24_, pipelinein0_.pipeline_definition_id as pipeline3_24_, pipelinein0_.state as state2_24_ from pipeline_instance pipelinein0_ where pipelinein0_.state='NOT_COMPLETED';
```
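I guess that once the lines are grouped, I would also need to extract the interesting fields from the grouped message, e.g. with a grok processor in an ingest pipeline. This is only a sketch of what I have in mind; the pipeline name and the field names (copied from what the Filebeat mysql module exports) are my own assumptions:

```
PUT _ingest/pipeline/mysql-slowlog
{
  "description": "Sketch: pull numeric metrics out of a grouped MySQL slow log entry",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "# Query_time: %{NUMBER:mysql.slowlog.query_time.sec:float} +Lock_time: %{NUMBER:mysql.slowlog.lock_time.sec:float} +Rows_sent: %{NUMBER:mysql.slowlog.rows_sent:long} +Rows_examined: %{NUMBER:mysql.slowlog.rows_examined:long}"
        ]
      }
    }
  ]
}
```

(I believe I could then point Filebeat at it with `pipeline: mysql-slowlog` in the Elasticsearch output.)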

But if I group everything like that, I wonder whether I will still be able to run operations on the data later, e.g. show me the top 10 slowest queries. If all the lines end up in one single document, will that still be possible?
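To make it concrete, this is the kind of request I would like to be able to run afterwards. I am assuming here that the query time ends up in a numeric field; the index pattern and the field names are just placeholders:

```
GET filebeat-*/_search
{
  "size": 10,
  "_source": [ "mysql.slowlog.query", "mysql.slowlog.query_time.sec" ],
  "sort": [
    { "mysql.slowlog.query_time.sec": { "order": "desc" } }
  ]
}
```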

If so, how could I group these lines into one document? Unlike the examples I have found, I don't have a physical log file and I don't use any log inputs; I only have the gcp-pubsub input configured in my Filebeat:

```yaml
- type: gcp-pubsub
```
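For a plain log input I would reach for the multiline settings, something like the sketch below, but I don't know whether the gcp-pubsub input supports them at all. The project, topic, subscription and credentials values are made up:

```yaml
filebeat.inputs:
  - type: gcp-pubsub
    project_id: my-gcp-project                      # made up
    topic: mysql-slow-log                           # made up
    subscription.name: filebeat-sub                 # made up
    credentials_file: /etc/filebeat/gcp-creds.json  # made up
    # The grouping I have in mind: a new event starts at "# Time:",
    # and every line that does NOT match is appended to the previous event.
    multiline.pattern: '^# Time:'
    multiline.negate: true
    multiline.match: after
```

Is something like that possible with this input?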

Thank you very much
