Issue with plotting daily field values

Hi everyone, I'm relatively new to ES and Kibana. I'm currently looking for a way to plot field values, which consist of a float value (y-axis) plotted against a time (x-axis).

As I understand it, it's not possible to plot value by value without an aggregation. However, I have exactly one value per day, so I figured I could set the y-axis to Average and the x-axis to a Date Histogram, so that it displays one average per day, which is exactly the value I want.

But I can't manage to do it. It displays multiple days, but with the overall average at each point instead of each daily value, so I get a horizontal line rather than a line following the values.

Is it possible to do what I want, and how? What am I missing?

As you pointed out already, you can use a date histogram with the interval set to "Daily" to create exactly one bucket per day. Nevertheless, if you select too large a time range, we will use a larger bucket size, so your browser won't crash drawing too complex charts. If that doesn't work for you, a screenshot of your configuration and chart would be helpful.

If you are really looking to visualize individual documents, because you know you only have very few of them (daily documents), you can look into creating a scatterplot via Vega, though Vega visualizations bring their own grammar and are thus a more advanced topic.
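As a rough, untested sketch (the index name `daily-values` and the field names are placeholders for your own), such a Vega-Lite scatterplot in Kibana could look something like this:

```json
{
  "$schema": "https://vega.github.io/schema/vega-lite/v2.json",
  "data": {
    "url": {
      "index": "daily-values",
      "body": { "size": 1000, "_source": ["date", "value"] }
    },
    "format": { "property": "hits.hits" }
  },
  "mark": "point",
  "encoding": {
    "x": { "field": "_source.date", "type": "temporal" },
    "y": { "field": "_source.value", "type": "quantitative" }
  }
}
```

This fetches the raw hits (up to `size`) instead of aggregating, so it only makes sense for small document counts like yours.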


Thank you for your answer, in fact I resolved the problem myself ^^'

I was inserting the data by saving a list of Python dictionaries: [{'date':'2018-01-01'},{'value':6000.0}, etc.], but now I'm inserting the key/value couples separately, so they are different "hits" in ES, and now it works.
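To make the fix concrete (this is just my reading of what changed, with the field names from my example above):

```python
# Before: date and value lived in separate dictionaries, so they could not
# be paired up as one document per day.
before = [{"date": "2018-01-01"}, {"value": 6000.0}]

# After: each (date, value) couple is its own document / "hit", so a Date
# Histogram on "date" with an Average of "value" yields one point per day.
after = [
    {"date": "2018-01-01", "value": 6000.0},
    {"date": "2018-01-02", "value": 6100.0},
]
```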

Problem is, the insertion is very slow: about 5 minutes for ~2000 key/value pairs, which seems strange. Do you know a good way to insert key/value pairs with Python Elasticsearch DSL more efficiently?

Hi Thomas,

I think if you are needing 5 minutes for inserting 2000 documents, there is definitely room for improvement :smiley:

Are you inserting every document separately using the save method from the API? I think that triggers a separate request for each document, which would be really slow. You should instead use the bulk API in the Python client to insert multiple documents with one request.
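A minimal sketch of what that could look like, assuming the `elasticsearch-py` client is installed and using a made-up index name (`daily-values`) and series:

```python
from datetime import date, timedelta

# Hypothetical daily series: exactly one (date, value) pair per day.
daily_values = [
    (date(2018, 1, 1) + timedelta(days=i), 6000.0 + 100 * i) for i in range(3)
]

def bulk_actions(index_name, series):
    """Yield one bulk action per day, keeping date and value in the SAME document."""
    for day, value in series:
        yield {
            "_index": index_name,
            "_source": {"date": day.isoformat(), "value": value},
        }

# Against a real cluster this becomes one HTTP request instead of ~2000:
#
#   from elasticsearch import Elasticsearch, helpers
#   es = Elasticsearch()
#   helpers.bulk(es, bulk_actions("daily-values", daily_values))
```

The `helpers.bulk` function takes any iterable of actions, so a generator like this keeps memory usage flat even for large series.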


That's what I thought :smiley:

Yes, I call the save method for every document, plus init() and the method defining the fields with the mapping configuration. I suspected it was not the best way ^^

Thanks I didn't know that, I'll look at it!

Hi @timroes, I tried the method with the bulk API this morning, and indeed this resolved the issue. Now all documents are inserted in a few seconds, instead of several minutes :slight_smile:

That sounds way more like what I would have expected :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.