Kibana Groovy Scripted Field for Aggregation: Visualize: Unsupported script value

Hi,

I am trying to get the number of log entries per day of week and visualize it in a bar chart.

For development, I have enabled inline Groovy scripting for aggregations in Elasticsearch:
script.engine.groovy.inline.aggs: on

I have written an aggregation query and successfully tested it in Sense.

{ "query": { "filtered": { "query": { "query_string": { "analyze_wildcard": true, "query": "*" } }, "filter": { "bool": { "must": [ { "range": { "@timestamp": { "gte": 1459461600000, "lte": 1462053599999, "format": "epoch_millis" } } } ], "must_not": [] } } } }, "size": 0, "aggs": { "2": { "terms": { "script": "doc['@timestamp'].date.dayOfWeek().getAsText()", "order": { "_term": "asc" } } } } }

Now, I would like to use the same type of dynamic calculation for the visualization:

  1. I created a scripted field named 'custom.timestamp.dayofweek' for the corresponding index pattern in Kibana.
  2. I created a new visualization with an x-axis bucket of type "Terms" on the field 'custom.timestamp.dayofweek'.
  3. I opened the advanced section for that bucket and entered the following JSON to override the placeholder script of the scripted field:
    {
    "script": "doc['@timestamp'].date.dayOfWeek().getAsText()",
    "lang" : "groovy"
    }

As a result, I do not get a bar chart matching the buckets returned by Sense. Instead, I receive error messages such as
"Visualize: Unsupported script value [Donnerstag] Unsupported script value [Dienstag] Unsupported script value [Freitag]"

From this I can tell that the calculation itself worked as expected (those are German weekday names), but the Kibana visualization does not handle the result properly. I assume I have to change some configuration.

Does anyone have a hint on how to fix this?

Thanks
Benjamin

Hi Benjamin,

Try changing this:

{
"script": "doc['@timestamp'].date.dayOfWeek().getAsText()",
"lang" : "groovy"
}

To this:

{
"script": "doc['@timestamp'].date.dayOfWeek().getAsText()",
"lang" : "groovy",
"valueType": "string"
}

By the way, I was able to get this working in a slightly different way (though I don't think it is better or worse than yours):

When you create the scripted field, you can enter the script doc['@timestamp'].date.dayOfWeek().getAsText() in the scripted field definition itself.

Then, in the visualization editor, in the advanced section for the terms aggregation on the scripted field, you only need to specify { "lang": "groovy", "valueType": "string" }.
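
Either way, the terms aggregation that Elasticsearch ends up receiving should look roughly like the one you already tested in Sense, just with the two extra hints merged in (a sketch only; the bucket key "2", the script, and the order are taken from your query above):

{
  "size": 0,
  "aggs": {
    "2": {
      "terms": {
        "script": "doc['@timestamp'].date.dayOfWeek().getAsText()",
        "lang": "groovy",
        "valueType": "string",
        "order": { "_term": "asc" }
      }
    }
  }
}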

Thanks a lot Shaunak!
Defining the valueType did the job!

Now, there is only one thing missing. Is there any way to tie the buckets to a defined order?
The buckets are named after the weekdays (monday, tuesday, ...),
but they can only be ordered alphabetically by term (i.e. friday, monday, saturday, sunday, tuesday, thursday, wednesday)
or by count, which is the number of items in each bucket.
However, what the user needs is the natural order of the weekdays.

Do you have an idea how to solve that?

thanks
Benjamin

There might be a better way to achieve natural ordering of the weekdays but the best I could think of was this:

Change your script expression to doc['@timestamp'].date.dayOfWeek().get() + "_" + doc['@timestamp'].date.dayOfWeek().getAsText() and then in the visualization editor, order by term in ascending order. That'll show you the days in their natural order but as 1_Monday, 2_Tuesday, etc.

The leading digits are not perfect for presentation, but this works with a minor modification:

doc['@timestamp'].date.dayOfWeek().get()+'_'+doc['@timestamp'].date.dayOfWeek().getAsText()

I had to surround the underscore with single quotes, since double quotes already delimit the script string in the JSON:

{ "script": "doc['@timestamp'].date.dayOfWeek().get()+'_'+doc['@timestamp'].date.dayOfWeek().getAsText()", "lang" : "groovy", "valueType": "string" }

Thanks for your support!