Display concurrency in Kibana 5.x

Has anyone been able to successfully replicate and visualize concurrency in data in Kibana 5.x?
I have start, end, and duration timestamps in each message, and I'm trying to follow the directions in a post from 2015, but I'm getting an error in Kibana that doesn't show what the problem is.

I'm trying to follow these instructions: Display concurrency in data on Kibana

Hi @zahodi,

could you be more specific about the error you are seeing?

@weltenwort,
Elasticsearch throws this error:
"Variable [start] is not defined."

This is the script option that I'm adding to the Advanced JSON field in Kibana:
{
  "script": "start = doc['netflow']['StartSecondsMilliseconds'].value;
    duration = doc['netflow']['flow_duration'].value * 1000;
    l = [];
    for (long i = 0; i < duration; i += 60000) { l.add(start + i); };
    return l;"
}

Ah, it looks like you're still trying to use the old Groovy script syntax. Starting with 5.0, Elasticsearch introduced "Painless" as the default scripting language. You could express that snippet in Painless like this:

def stepSeconds = 60;
def start = Instant.ofEpochMilli(doc['netflow.StartSecondsMilliseconds'].value.millis);
def end = Instant.ofEpochMilli(doc['netflow.EndSecondsMilliseconds'].value.millis);
def dates = [];
for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) {
  dates.add(i);
}
return dates;

I guessed the end field name above, so it might not be 100% applicable to your scenario. But I hope this gives you an idea. I'd be happy to give more specific advice if you share more of your data schema.

@weltenwort Now Kibana is throwing an error, but I can't tell where the problem is or how to see the error. I'm guessing it's something with the syntax.

I think it does not support line breaks in that field. Could you try removing the line breaks within the inline field value?
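For example (still using the field names I guessed earlier), the inline value would need to sit entirely on one line, something like this:

{
  "script" : {
    "inline" : "def stepSeconds = 60; def start = Instant.ofEpochMilli(doc['netflow.StartSecondsMilliseconds'].value.millis); def end = Instant.ofEpochMilli(doc['netflow.EndSecondsMilliseconds'].value.millis); def dates = []; for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) { dates.add(i); } return dates;",
    "lang" : "painless"
  }
}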

That was it! Now I'm back to Elasticsearch throwing errors:
Caused by: java.lang.IllegalArgumentException: No field found for [netflow.StartSecondsMilliseconds] in mapping with types []

It looks like the assumptions I made about the field names are incorrect. Would it be possible for you to post the index mapping (obtained via GET /${INDEXNAME}/_mapping)?

I'm only pasting the part of the mapping that is relevant to this topic. Let me know if more is needed:

"netflow" : {
"flowEndMilliseconds" : {
  "type" : "date"
},
"flowStartMilliseconds" : {
  "type" : "date"
}
  }

So according to the mapping snippet, the field names in your script should be netflow.flowStartMilliseconds and netflow.flowEndMilliseconds.
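That is, the two definitions would become (keeping the rest of the script unchanged):

def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value.millis);
def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value.millis);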

Ah, good catch. Now I'm getting another Elasticsearch error:

Caused by: java.lang.IllegalArgumentException: Unable to find dynamic field [millis] for class [java.lang.Long].

Hm, it works for me, but could you try removing the trailing .millis from the def start and def end statements?
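So those two lines would read:

def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value);
def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value);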

New error:
Caused by: java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.Number

My code at this point:
{
  "script" : {
    "inline" : "def stepSeconds = 60; def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value); def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value); def dates = []; for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) { dates.add(i); } return dates;",
    "lang" : "painless"
  }
}

It seems there are still some incorrect assumptions about the document values and mapping types in the code. A sample document (specifically the netflow.flowStartMilliseconds and netflow.flowEndMilliseconds values) would be helpful. Also, looking back at the mapping snippet, the two field definitions should be nested inside a properties key under netflow, as sketched below. Maybe you could provide a more complete sample of that as well?
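That is, I would expect the netflow part of the full mapping to be shaped like this:

"netflow" : {
  "properties" : {
    "flowStartMilliseconds" : { "type" : "date" },
    "flowEndMilliseconds" : { "type" : "date" }
  }
}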

@weltenwort, sample message:

{
  "_index": "logstash2017.12.20",
  "_type": "logs",
  "_id": "AWBxk99JVIYU6tGVf-8W",
  "_version": 1,
  "found": true,
  "_source": {
    "@version": "1",
    "host": "10.xx.xx.xx",
    "netflow": {
      "flowset_id": 45896,
      "score": 100,
      "flow_duration": 1.601,
      "flowEndMilliseconds": "2017-12-20T01:39:59.621Z",
      "flow_av_in_throughput": 0,
      "version": 10,
      "flowEndSeconds": "2017-12-20T01:39:59.000Z",
      "flowStartMilliseconds": "2017-12-20T01:39:58.020Z",
      "hashTag": 2635236172,
      "end_date": "2017-12-20T01:39:59.621Z",
      "destinationTransportPort": 80,
      "flow_av_out_throughput": 739.5377888819488,
      "totalRtt": 0,
      "sourceTransportPort": 35388,
      "start_date": "2017-12-20T01:39:58.020Z",
      "octetTotalCount": 148,
      "proceraOutgoingDscp": 0,
      "refIncomingThroughput": 0,
      "trips": 1,
      "traffic_class": "Web Browsing",
      "flowStartSeconds": "2017-12-20T01:39:58.000Z"
    },
    "@timestamp": "2017-12-20T01:40:04.000Z",
    "tags": [
      "pre-web"
    ]
  }
}

and the mapping:

{
  "logstash2017.12.20" : {
    "mappings" : {
      "logs" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date"
          },
          "@version" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "host" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "netflow" : {
            "properties" : {
              "destinationIPv4Address" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "destinationTransportPort" : {
                "type" : "long"
              },
              "end_date" : {
                "type" : "date"
              },
              "flowEndMilliseconds" : {
                "type" : "date"
              },
              "flowEndSeconds" : {
                "type" : "date"
              },
              "flowStartMilliseconds" : {
                "type" : "date"
              },
              "flowStartSeconds" : {
                "type" : "date"
              },
              "flow_av_in_throughput" : {
                "type" : "float"
              },
              "flow_av_out_throughput" : {
                "type" : "float"
              },
              "flow_duration" : {
                "type" : "float"
              },
              "flowset_id" : {
                "type" : "long"
              },
              "hashTag" : {
                "type" : "long"
              },
              "octetTotalCount" : {
                "type" : "long"
              },
              "packetTotalCount" : {
                "type" : "long"
              },
              "protocolIdentifier" : {
                "type" : "long"
              },
              "refIncomingThroughput" : {
                "type" : "long"
              },
              "refOutgoingThroughput" : {
                "type" : "long"
              },
              "score" : {
                "type" : "float"
              },
              "sourceIPv4Address" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "sourceTransportPort" : {
                "type" : "long"
              },
              "start_date" : {
                "type" : "date"
              },
              "totalRtt" : {
                "type" : "long"
              },
              "traffic_class" : {
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword",
                    "ignore_above" : 256
                  }
                }
              },
              "trips" : {
                "type" : "long"
              },
              "version" : {
                "type" : "long"
              }
            }
          },
          "tags" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  }
}

Which version of the Elastic Stack are you using, exactly? I guess the .millis we removed earlier might be the problem.

@weltenwort, we are using 5.6.3. Should I be using seconds only, then?

OK, it seems to be a combination of a date type error and the fact that returning arrays in inline scripts does not work anymore. I have gotten it to work using a scripted field instead (added under Management > Index Patterns in Kibana). Here's the script code as text for copy and pasting (note the added toEpochMilli() call in line 6):

def stepSeconds = 60;
def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value);
def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value);
def dates = [];
for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) {
  dates.add(i.toEpochMilli());
}
return dates;
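To sanity-check the script outside of Kibana, you could also run it against a single document via a script_fields search, assuming the same doc-value behavior as in the scripted field (the field name concurrency_buckets below is just an example; the index name is taken from your sample document):

POST /logstash2017.12.20/_search
{
  "size": 1,
  "script_fields": {
    "concurrency_buckets": {
      "script": {
        "inline": "def stepSeconds = 60; def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value); def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value); def dates = []; for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) { dates.add(i.toEpochMilli()); } return dates;",
        "lang": "painless"
      }
    }
  }
}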

That way you can just use the new field in the visualization editor.

Let me know if that works for you.

@weltenwort,
Thanks! That worked well. But is there a way to use a different value for stepSeconds? I have a field called netflow.flow_duration that I want to use instead of the static value of 60.

I tried this and it didn't work:

def stepSeconds = doc['netflow.duration'].value;
def start = Instant.ofEpochMilli(doc['netflow.flowStartMilliseconds'].value);
def end = Instant.ofEpochMilli(doc['netflow.flowEndMilliseconds'].value);
def dates = [];
for(def i = start; i.isBefore(end) || i.equals(end); i = i.plusSeconds(stepSeconds)) {
  dates.add(i.toEpochMilli());
}
return dates;