Parsing and searching data in Logstash and Elasticsearch

This is an extract of my log file


I have parsed it, and this is the resulting JSON


and here is the result as shown in the dashboard

Now I want to determine the execution time of each request in each thread, like this:

QueryA
thread262 2 milliseconds (the first)

thread263 4 milliseconds

QueryB
thread262 1 millisecond

Thank you!

How do you know that query A is 2 milliseconds and query B is 1 millisecond and not the other way around?

It's just an assumption: the first timestamp should be attributed to the first query.
Thank you

This does it the other way around, with the first timestamp going to the last query, but it should get you started:

filter {
  # Split the log line into the timestamp, the thread name, and the trailing text
  dissect { mapping => [ "message", '%{ts} %{+ts} %{+ts} %{+ts} |%{} : %{} |%{}-[%{thread}]: %{} %{text}' ] }
  if [text] =~ /^Find/ {
    mutate { gsub => [ "text", "Find query :  : ", "" ] }
    # Remember the query text on a per-thread list, keyed by the thread name
    aggregate {
      task_id => "%{thread}"
      code => "(map['queries'] ||= []).push(event.get('text'))"
    }
    # The "Find query" line itself is not indexed
    drop {}
  }
  if [text] =~ /^Elapsed/ {
    # Strip the surrounding words so only the number of milliseconds remains
    mutate { gsub => [ "text", "ElapsedTime for Query Execution ", "", "text", " milliseconds", "" ] }
    mutate { convert => { "text" => "integer" } }
    # Attach the most recently remembered query for this thread to the event
    aggregate {
      task_id => "%{thread}"
      code => "event.set('query', map['queries'].pop)"
      map_action => "update"
    }
  }
  date { match => [ "ts" , "MMM dd',' YYYY HH:mm:ss:SSS" ] }
}
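The reason the first timestamp ends up on the last query is that the second aggregate block uses Ruby's Array#pop, which is LIFO. A small standalone Ruby sketch of the per-thread map (the names mirror the config above, for a single thread):

```ruby
# Simulate the per-thread 'map' hash that logstash-filter-aggregate
# passes into each code block.
map = {}

# Two "Find query" events arrive on the same thread, in order:
(map['queries'] ||= []).push('QueryA')
(map['queries'] ||= []).push('QueryB')

# The "ElapsedTime" branch uses pop, so the first elapsed-time line
# is paired with the *last* query pushed (LIFO):
puts map['queries'].pop    # prints "QueryB"

# To pair the first elapsed-time line with the *first* query instead
# (the asker's hypothesis), use shift (FIFO):
puts map['queries'].shift  # prints "QueryA"
```

Swapping pop for shift in the second aggregate block would be the only change needed to match the first-timestamp-to-first-query assumption.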

You would need to think about how to handle timeouts, otherwise this will leak memory. Also, make sure you run Logstash with --pipeline.workers 1, since the aggregate filter only works correctly with a single worker.
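For the timeout concern, the aggregate filter has a timeout option that discards a task's map after a period of inactivity. A minimal sketch, applied to the first aggregate block (the 120-second value is an arbitrary example, tune it to how long a query can run):

```
aggregate {
  task_id => "%{thread}"
  code => "(map['queries'] ||= []).push(event.get('text'))"
  # Discard a thread's map after 120 seconds without matching events,
  # so unmatched "Find query" lines cannot accumulate forever
  timeout => 120
}
```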


Thank you so much, it works!

Can I assign a unique ID to every query while parsing my log, and show the actual query in a pop-up in Kibana?
Thank you

You might want to ask a new question for that.
