Display concurrency in data on Kibana

I have fields for start date and duration (seconds) representing the start time and length of a phone call. I would like to compute concurrency (a number): how many calls to the same destination (also a field) are active at the same time.

That is, if a call starts now and lasts 30 seconds, and another call to the same destination starts 15 seconds into the first call, concurrency should be 2 for the remaining 15 seconds.

How do I define this in Kibana visualization as a line graph over time?
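To make the desired output concrete, here is a small Python sketch (function and field names are illustrative, not part of any Elastic API) that computes per-second concurrency from (start, duration) pairs for a single destination:

```python
from collections import Counter

def concurrency_per_second(calls):
    """Count overlapping calls per second.

    calls: list of (start_second, duration_seconds) tuples
    for one destination. Returns {second: active_call_count}.
    """
    counts = Counter()
    for start, duration in calls:
        # A call occupies every second from start up to (but not
        # including) start + duration.
        for second in range(start, start + duration):
            counts[second] += 1
    return counts

# Two calls to the same destination: the second starts 15 s
# into the first 30-second call.
counts = concurrency_per_second([(0, 30), (15, 30)])
print(counts[10])  # 1 -> only the first call is active
print(counts[20])  # 2 -> both calls overlap here
print(counts[40])  # 1 -> only the second call remains
```

Plotting these counts over time is exactly the line graph being asked for; the rest of the thread is about making Elasticsearch/Kibana produce the same numbers.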


This is very difficult and probably not the right tool for the task.

In order to do this you would have to write a script within ES that computes the concurrency by aggregating on the destination and calculating it for every event.

The easy way to do this is to write the concurrency to the log files when you record the values - so whenever a call starts, check whether a call to that destination already exists (and how many of them exist) and write the count to the logs. Then you can easily graph it.
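As a sketch of that logging-time approach (assuming events arrive sorted by start time; all names here are illustrative):

```python
def annotate_concurrency(events):
    """For each call event (start, duration, destination), record
    how many calls to the same destination are active when it
    starts, including itself. Events must be sorted by start time.
    """
    active = []  # (end_time, destination) of calls still in progress
    annotated = []
    for start, duration, dest in events:
        # Drop calls that have already ended by this start time.
        active = [(end, d) for end, d in active if end > start]
        concurrency = 1 + sum(1 for _, d in active if d == dest)
        annotated.append((start, duration, dest, concurrency))
        active.append((start + duration, dest))
    return annotated

events = [(0, 30, "x"), (15, 30, "x"), (50, 10, "x")]
print(annotate_concurrency(events))
# [(0, 30, 'x', 1), (15, 30, 'x', 2), (50, 10, 'x', 1)]
```

With the concurrency stored on each log record, a plain max or average aggregation in Kibana is enough to graph it.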

Hope this helps.

-- Asaf.

You can actually do this if you query Elasticsearch directly by using a script on a date_histogram aggregation which adds the call document to all the buckets from the start_time to start_time + duration. The following gist contains a Sense recreation that details what I mean: https://gist.github.com/jpountz/cebb8452648c36099cee
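What that script does, in essence, is map each call onto every histogram bucket it overlaps, so the bucket's doc count becomes the concurrency. A Python sketch of the same bucketing idea (the 60 000 ms interval mirrors the one-minute histogram in the gist):

```python
def buckets_for_call(start_ms, duration_ms, interval_ms=60_000):
    """Return one timestamp per interval the call spans, mirroring
    the date_histogram script: the histogram then assigns each
    timestamp to its bucket, so a call is counted in every bucket
    it overlaps."""
    return [start_ms + i for i in range(0, duration_ms, interval_ms)]

# A 150-second call starting at t=0 touches three one-minute buckets.
print(buckets_for_call(0, 150_000))  # [0, 60000, 120000]
```

Counting documents per bucket (optionally split by destination) then yields the concurrent-calls line.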

As for doing this in Kibana, I know it's possible to run scripts in Kibana for some actions but I'm not sure if this would be possible, maybe someone more familiar with Kibana could comment? At the very least you would probably need to have a file script for this as Kibana won't let you run inline groovy scripts.

Hope that helps


Wow, nice script @colings86 and @jpountz

@EricK if you have scripting enabled in elasticsearch you can use it within your date_histogram by overriding the aggregations parameters with the "JSON input" advanced config option. This would look something like the screenshot below (note the field: null bit which removes the field parameter from the params):


This is awesome!

Why are we multiplying duration by 1000? Also, is "interval" a Kibana variable? Lastly, I can't seem to locate documentation on using the "JSON input" field this way, which would explain the varied responses.

Any documentation pointers would be useful.

Tried the script in Kibana and I'm getting no results. Anything I'm doing wrong?

Default scripting language in Kibana is Lucene Expressions, so you have to specify {"script" : "blah", "lang" : "groovy"} in your JSON input.

You can also use static scripts, if you don't want to use dynamic - more on that here: Calling groovy script from Kibana

1 Like

Hi,

Thank you Tanya for the link. I did all this and Kibana sends the following request to Elasticsearch:

 "aggs": {
    "3": {
      "date_histogram": {
        "field": "recordStart",
        "interval": "minute",
        "pre_zone": "+02:00",
        "pre_zone_adjust_large_interval": true,
        "min_doc_count": 1,
        "extended_bounds": {
          "min": 1414574705856,
          "max": 1415722155456
        },
        "script": "start = doc['recordStart'].value; duration = doc['recordDuration'].value; l = []; for (long i = 0; i < duration; i += 60000) { l.add(start + i); }; return l;",
        "lang": "groovy"
      }
    }
  }

This query returns an error. When I remove the "field" property and rerun it in Elasticsearch directly, it works fine. Is this a bug in Kibana, that it adds the "field" property as well as the script? Is there a workaround?

What kind of error do you get?

{

"error": "SearchPhaseExecutionException[Failed to execute phase [query], all shards failed; shardFailures {[NZCzFCLeQiOQrSjvhSWi8A][am][0]: QueryPhaseExecutionException[[am][0]: query[ConstantScore(:)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: ClassCastException[java.util.ArrayList cannot be cast to java.lang.Number]; }{[NZCzFCLeQiOQrSjvhSWi8A][am][1]: QueryPhaseExecutionException[[am][1]: query[ConstantScore(:)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: ClassCastException[java.util.ArrayList cannot be cast to java.lang.Number]; }{[NZCzFCLeQiOQrSjvhSWi8A][am][2]: QueryPhaseExecutionException[[am][2]: query[ConstantScore(:)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: ClassCastException[java.util.ArrayList cannot be cast to java.lang.Number]; }{[NZCzFCLeQiOQrSjvhSWi8A][am][3]: QueryPhaseExecutionException[[am][3]: query[ConstantScore(:)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: ClassCastException[java.util.ArrayList cannot be cast to java.lang.Number]; }{[NZCzFCLeQiOQrSjvhSWi8A][am][4]: QueryPhaseExecutionException[[am][4]: query[ConstantScore(:)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: ClassCastException[java.util.ArrayList cannot be cast to java.lang.Number]; }]",
"status": 500
}

In re-reading Spencer's post below, he suggests an additional parameter to the script to remove the field property. I see that you're missing that in your script. Could you try adding that and see what happens?

I also tried this but it doesn't remove the field parameter as promised. With the following json input:

{
  "interval": "minute",
  "script": "start = doc['recordStart'].value; duration = doc['recordDuration'].value; l = []; for (long i = 0; i < duration; i += 60000) { l.add(start + i); }; return l;",
  "field": null,
  "lang": "groovy"
}

I get the following request

"aggs": {
  "2": {
    "date_histogram": {
      "field": null,
      "interval": "minute",
      "pre_zone": "+02:00",
      "pre_zone_adjust_large_interval": true,
      "min_doc_count": 1,
      "extended_bounds": {
        "min": 1366292067758,
        "max": 1433416997446
      },
      "script": "start = doc['recordStart'].value; duration = doc['recordDuration'].value; l = []; for (long i = 0; i < duration; i += 60000) { l.add(start + i); }; return l;",
      "lang": "groovy"
    }
  }
}

Which gives the following error

Error: Request to Elasticsearch failed: (big part removed because message was too long)

Failure [Unexpected token VALUE_NULL in [2].]]; }]"}
at http://localhost:5601/index.js?_b=7489:43092:38
at Function.Promise.try (http://localhost:5601/index.js?_b=7489:46434:26)
at http://localhost:5601/index.js?_b=7489:46412:27
at Array.map (native)
at Function.Promise.map (http://localhost:5601/index.js?_b=7489:46411:30)
at callResponseHandlers (http://localhost:5601/index.js?_b=7489:43064:22)
at http://localhost:5601/index.js?_b=7489:43182:16
at wrappedCallback (http://localhost:5601/index.js?_b=7489:20893:81)
at wrappedCallback (http://localhost:5601/index.js?_b=7489:20893:81)
at http://localhost:5601/index.js?_b=7489:20979:26

It worked for me with the following data and config:

curl -XPOST 'http://localhost:9200/test3' -d '{
    "settings" : {
        "number_of_shards" : 1
    },
    "mappings" : {
        "test" : {
            "properties" : {
                "start_date" : { "type" : "date"},
                "end_date" : { "type" : "date"},
                "duration" : { "type" : "integer"}
            }
        }
    }
}'

curl -XPUT 'http://localhost:9200/test3/test/1' -d '{
    "start_date" : "2015-08-23T00:01:00",
    "end_date" : "2015-08-23T14:02:00",
    "duration" : 1
}'

curl -XPUT 'http://localhost:9200/test3/test/2' -d '{
    "start_date" : "2015-08-23T00:01:30",
    "end_date" : "2015-08-23T14:02:30",
    "duration" : 1
}'

curl -XPUT 'http://localhost:9200/test3/test/3' -d '{
    "start_date" : "2015-08-23T00:01:45",
    "end_date" : "2015-08-23T14:02:45",
    "duration" : 1
}'

curl -XPUT 'http://localhost:9200/test3/test/4' -d '{
    "start_date" : "2015-08-23T00:01:55",
    "end_date" : "2015-08-23T14:02:55",
    "duration" : 1
}'

curl -XPUT 'http://localhost:9200/test3/test/5' -d '{
    "start_date" : "2015-08-23T00:02:00",
    "end_date" : "2015-08-23T14:04:00",
    "duration" : 1
}'

Script:

{
"interval":"minute",
"script": "start = doc['start_date'].value; duration = doc['duration'].value*1000*1000; l = []; for (long i = 0; i < duration; i += 60000) { l.add(start + i); }; return l;",
"field": null,
"lang": "groovy"
}


Very strange, I've followed your example with exactly the same data and config, and my Kibana throws the same error I mentioned earlier. I'm using:

Kibana server: 4.1.1
Elasticsearch: 1.7.1

Is it possible that I've got some settings wrong?

Ah - I'm running ES 2.0 beta 1 (just released yesterday) and a snapshot build of Kibana 4.2 (which you can get at the bottom of the page here, and should be released as 4.2 beta soon).

When I run this on ES 1.6 and Kibana 4.1.1 I get the same error as you. Let me see if I can get some clarification on that.

I confirmed that the "field": null feature was introduced in the Kibana 4.2 timeframe, so that is why this is not working in Kibana 4.1.1. @joris_renkens assuming you're running in test / dev, would it be possible for you to use ES 2.0 beta and Kibana 4.2 beta for this project?

It worked! Do you have any idea when ES 2.0 and Kibana 4.2 will become stable?

Official Kibana 4.2.0-beta1 was released today: https://www.elastic.co/blog/kibana-4-2-beta-1-i-heard-you-like-betas

GA for ES 2.0 and Kibana 4.2 depends a bit on the kind of feedback we get from the community and whether any significant changes are required, but we are currently targeting it to early fall :slight_smile:


Hi,
I'm sorry to bump this old thread, but I think I have a similar problem.

I have fields "Start Timestamp", "End Timestamp", and "Status", and I would like to create a daily date histogram counting how many transactions there are per day, per status. The duration of a transaction can be anything from one hour to 2-3 weeks, so a transaction of, say, 2 days should be included in 2 bars. I don't know how to proceed with this. Any suggestions? (I'm using ES 5.2)
Br
Cristian
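The same bucket-expansion idea from earlier in the thread applies here, just at daily granularity: map each transaction to every calendar day between its start and end timestamps. A sketch (function name is illustrative):

```python
from datetime import datetime, timedelta

def day_buckets(start, end):
    """Return each calendar day that a transaction spanning
    [start, end] touches, so it can be counted once per daily
    histogram bar."""
    days = []
    day = start.date()
    while day <= end.date():
        days.append(day)
        day += timedelta(days=1)
    return days

# A transaction from the evening of Mar 1 to the morning of Mar 3
# touches three daily buckets: Mar 1, Mar 2, and Mar 3.
print(day_buckets(datetime(2017, 3, 1, 22), datetime(2017, 3, 3, 4)))
```

In ES this would again be a date_histogram script that emits one value per day spanned, grouped by status.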

I'm in a similar situation, except I have start & end time instead of start & duration. I got the previous examples working in current versions of ES (5.2) with updated syntax, but Groovy is deprecated: it requires enabling in elasticsearch.yml, and then still throws deprecation warnings in the logs.

[details=Groovy script for 5.2]
GET test/_search
{
  "aggs": {
    "my_histo": {
      "date_histogram": {
        "script": {
          "lang": "groovy",
          "inline": "start = doc['start_date'].value; duration = doc['duration'].value*1000*1000; l = []; for (long i = 0; i < duration; i += 60000) { l.add(start + i); }; return l;"
        },
        "interval": "minute"
      }
    }
  }
}
[/details]

So... anyone know if this is now possible in Timelion? It feels like it should be, but I'm just diving into Timelion for the first time and not finding a function that maps to this off the bat.

Or, if it's not possible in Timelion yet, is it time to move on to Painless?