Datatable visualization on max time taken URLs from nginx access log


(Tony Thomas) #1

Hi,

I am new to ELK and am creating dashboards for nginx access logs. I have found it pretty useful for creating nice visualizations. One more data table from my requirements is still pending; the requirement is given below (I am using the default logstash-* index).

Each nginx access log entry records the URL hit by a user. I need to create a data table containing only the URLs whose maximum response time is more than one second, along with a count for each. So the table should contain 2 columns:

  1. URL
  2. Total count of URLs

Workaround: I can sort the URLs by max response time, but the list also contains URLs whose response time is less than 1 second. I want to keep only the URLs with a max response time of more than 1 second.

I tried the details given in the link below, but it didn't help: https://www.elastic.co/guide/en/beats/packetbeat/current/kibana-queries-filters.html#_range_queries
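[Editor's note: for reference, a plain range query in the Elasticsearch DSL, of the kind that link describes, would look like the sketch below. The field name rq_dur_regngx_ms is taken from the request payload pasted later in this thread; adjust it to your own mapping.]

```json
{
  "query": {
    "range": {
      "rq_dur_regngx_ms": {
        "gte": 1000
      }
    }
  }
}
```

The equivalent in Kibana's Lucene search bar would be `rq_dur_regngx_ms:>=1000`. Note that this filters individual log documents, not the per-URL aggregation buckets, which is why it does not by itself answer the "max per URL" requirement.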

Please help me solve this.


(Shaunak Kashyap) #2

Hi @tony_thomas,

What you are trying to achieve should be possible to do in Kibana. Could you please provide two pieces of information to help me figure it out?

  1. The exact range query that you tried (but didn't help out), and

  2. The request that was sent from Kibana to Elasticsearch. To get this, find the little up arrow at the bottom of the visualization in Kibana and click it to open a "spy panel". Click on the "Request" tab in this spy panel and paste its contents in this post.


(Tony Thomas) #3

Hi @shaunak,

Thanks for your reply.

I have added 2 screenshots in which you can see my Kibana configurations.
I have improved my data table by including the range, and it is working fine now. Please check ScreenShot_1.jpg.

In the resulting data table you can see the last two rows, which should not be in my result set. (The table contains more rows on the next page, all with response times of less than 1000 ms, which are also not required.) Please suggest how to avoid this.

Adding below the JSON payload sent to Elasticsearch, as you requested:

{"index":["logstash-2016.12.14","logstash-2016.12.13","logstash-2016.12.07","logstash-2016.12.09","logstash-2016.12.08","logstash-2016.12.10","logstash-2016.12.12","logstash-2016.12.11"],"ignore_unavailable":true,"preference":1481622811914}
{
  "query": {
    "bool": {
      "must": [
        { "query_string": { "analyze_wildcard": true, "query": "*" } },
        { "range": { "@timestamp": { "gte": 1481110922810, "lte": 1481715722810, "format": "epoch_millis" } } }
      ],
      "must_not": []
    }
  },
  "size": 0,
  "aggs": {
    "2": {
      "terms": {
        "field": "rq_regngx_pagecount.keyword",
        "size": 50,
        "order": { "2-orderAgg": "desc" }
      },
      "aggs": {
        "6": {
          "range": {
            "field": "rq_dur_regngx_ms",
            "ranges": [ { "from": 1000, "to": 200000 } ],
            "keyed": true
          },
          "aggs": {
            "5": { "max": { "field": "rq_dur_regngx_ms" } }
          }
        },
        "2-orderAgg": { "max": { "field": "rq_dur_regngx_ms" } }
      }
    }
  }
}


(Shaunak Kashyap) #4

Hi @tony_thomas,

Unfortunately, the functionality you are asking for is not currently possible in Kibana. It would require the Bucket Selector aggregation, which is available in Elasticsearch but not yet supported in Kibana.
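[Editor's note: outside Kibana, the filtering Tony asked for can be done by querying Elasticsearch directly with a bucket_selector pipeline aggregation. The sketch below assumes the field names from the request pasted above; the script syntax shown is for Elasticsearch 5.x (Painless) — on 2.x the script would be written without the `params.` prefix.]

```json
{
  "size": 0,
  "aggs": {
    "urls": {
      "terms": {
        "field": "rq_regngx_pagecount.keyword",
        "size": 50
      },
      "aggs": {
        "max_duration": {
          "max": { "field": "rq_dur_regngx_ms" }
        },
        "slow_urls_only": {
          "bucket_selector": {
            "buckets_path": { "maxDur": "max_duration" },
            "script": "params.maxDur > 1000"
          }
        }
      }
    }
  }
}
```

The terms aggregation buckets log entries by URL, the max sub-aggregation computes each URL's maximum response time, and the bucket_selector then discards every bucket whose maximum is 1000 ms or less, leaving only the slow URLs with their document counts.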


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.