Some indexes showing values as null

I'm trying to pair up Elasticsearch with Grafana, and the strange thing I can't diagnose is that some indexes work and some don't. All of the indexes hold the same kind of data with the same mapping, so I don't have a clue why some of them don't work. Here are a few details on the matter:

Some buckets do not have any data (doc_count: 0), so there is no value that makes sense as a return value for the inner numeric aggregation, and Elasticsearch returns null.
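For illustration, an empty bucket with a max sub-aggregation typically comes back looking like this in the response (a hypothetical excerpt, not taken from your output):

{
  "key" : 1470859996000,
  "doc_count" : 0,
  "1" : {
    "value" : null
  }
}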

I don't understand; the data seems to be there. If I query Elasticsearch directly for that field, I get plenty of hits:

{
    "query": {
        "query_string": {
            "query": "MAIN_hit_rate:*"
        }
    }
}

{
  "took" : 14,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 2102,
    "max_score" : 1.0,
    "hits" : [ { data here bla bla

With the Grafana query it still shows as empty:

{"search_type":"count","ignore_unavailable":true,"index":"agslx-gpolevc00-*"}
{"size":0,"query":{"filtered":{"query":{"query_string":{"analyze_wildcard":true,"query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"gte":"1470859996178","lte":"1470860296178","format":"epoch_millis"}}}]}}}},"aggs":{"2":{"date_histogram":{"interval":"200ms","field":"@timestamp","min_doc_count":0,"extended_bounds":{"min":"1470859996178","max":"1470860296178"},"format":"epoch_millis"},"aggs":{"1":{"max":{"field":"MAIN_hit_rate"}}}}}}

Output: http://pastebin.com/64gqdR15
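For reference, the gte/lte values in that range filter are epoch milliseconds; dropping the last three digits and feeding them to GNU date decodes the window Grafana is asking for (a quick sanity check, assuming GNU date is available):

date -u -d @1470859996    # Wed Aug 10 20:13:16 UTC 2016
date -u -d @1470860296    # Wed Aug 10 20:18:16 UTC 2016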

Apparently you do not have data in this range.

I did a

curl -XDELETE 'http://localhost:9200/*'

I deleted everything and started storing data again, yet Grafana still shows nothing. I'm checking the Logstash debug output, and with ES queries I can see that the data is being stored. I don't understand how this is not working.
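A quick way to confirm that the indices were recreated and are actually receiving documents after a wipe like that is the cat API (a sketch, using the same index pattern as above; docs.count should keep growing between runs):

curl -XGET 'http://localhost:9200/_cat/indices/agslx-gpolevc00-*?v'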

I'm getting the same problem with Kibana. I don't get why my ES wouldn't have data in those date ranges if Logstash is supposed to be writing it there.

Can you paste the output of curl -XGET 'http://localhost:9200/agslx-gpolevc00-*/_search' to get an idea of what your docs look like and what kind of dates they have, and curl -XGET 'http://localhost:9200/_mapping' to see how your dates are indexed?

curl -XGET 'http://localhost:9200/agslx-gpolevc00-*/_search'

{"took":4,"timed_out":false,"_shards":{"total":10,"successful":10,"failed":0},"hits":{"total":685,"max_score":1.0,"hits":[{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZgDbtBGGExh6QA58","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:49:05.064Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_204":0.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZgDbtBGGExh6QA5-","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:49:05.064Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_backend_fail":60.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZgDbtBGGExh6QA6D","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:49:05.064Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_conn_diff":180.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZgDbtBGGExh6QA6E","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:49:05.064Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_hit_rate":99.73,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2bVVHtBGGExh6QBln","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:57:05.081Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_304":3611534.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2bVVHtBGGExh6QBls","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:57:05.081Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_backend_conn":854222.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2bVVHtBGGExh6QBlu","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:57:05.081Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_hit_rate":99.73,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZvantBGGExh6QBAC","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:50:05.066Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_304":3606984.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZvantBGGExh6QBAF","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:50:05.066Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_n_lru_nuked":0.0,"tags":["varnishstat"]}},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2ZvantBGGExh6QBAJ","_score":1.0,"_source":{"@timestamp":"2016-08-10T13:50:05.066Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_hit_rate":99.73,"tags":["varnishstat"]}}]}}

curl -XGET 'http://localhost:9200/agslx-gpolevc00-*/_mapping'

{"agslx-gpolevc00-varnishtop":{"mappings":{"vc_server":{"properties":{"@timestamp":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"hostname":{"type":"string"},"parameter":{"type":"string","index":"not_analyzed"},"type":{"type":"string"},"value":{"type":"long"}}}}},"agslx-gpolevc00-varnishstat":{"mappings":{"vc_server":{"properties":{"@timestamp":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"MAIN_backend_conn":{"type":"double"},"MAIN_backend_fail":{"type":"double"},"MAIN_conn_diff":{"type":"double"},"MAIN_fetch_1xx":{"type":"double"},"MAIN_fetch_204":{"type":"double"},"MAIN_fetch_304":{"type":"double"},"MAIN_fetch_failed":{"type":"double"},"MAIN_hit_rate":{"type":"double"},"MAIN_n_lru_nuked":{"type":"double"},"MAIN_sess_drop":{"type":"double"},"hostname":{"type":"string"},"tags":{"type":"string"},"type":{"type":"string"}}}}}}

The thing I don't understand is that the indexes that do work have exactly the same mapping: I use the same template for all my indexes because they all receive the same kind of data in the same format.

The mappings look fine.

Your date range filter looks for documents whose timestamp is between Wed Aug 10 2016 20:13:16 GMT+0000 and Wed Aug 10 2016 20:18:16 GMT+0000 (a tiny range!) but the first documents that come out of your index have earlier timestamps. Let's see what your most recent documents are by running:

curl -XGET 'http://localhost:9200/agslx-gpolevc00-*/_search' -d '
{
  "sort": [{ "@timestamp" : {"order" : "desc"}}]
}
'
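A more compact check is to ask Elasticsearch directly for the oldest and newest timestamps in those indices (a sketch, assuming the standard min/max aggregations):

curl -XGET 'http://localhost:9200/agslx-gpolevc00-*/_search' -d '
{
  "size": 0,
  "aggs": {
    "oldest": { "min": { "field": "@timestamp" } },
    "newest": { "max": { "field": "@timestamp" } }
  }
}
'

The min and max come back as epoch milliseconds plus a formatted value_as_string, so they are easy to compare with the range Grafana asks for.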

Yeah, I chose to show only the last 5 minutes. I've now changed it to the last hour and it still shows empty buckets.

This is the result for the most recent documents:

{"took":24,"timed_out":false,"_shards":{"total":10,"successful":10,"failed":0},"hits":{"total":1525,"max_score":null,"hits":[{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXE","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_1xx":0.0,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXK","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_sess_drop":0.0,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishtop","_type":"vc_server","_id":"AVZ2wZuctBGGExh6QJXY","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","parameter":"/tapa/2016/08/10/OLE_20160810_02_jpg","value":17.0,"hostname":"agslx-gpolevc00"},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishtop","_type":"vc_server","_id":"AVZ2wZuctBGGExh6QJXb","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","parameter":"/","value":7.0,"hostname":"agslx-gpolevc00"},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXH","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_backend_fail":60.0,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXG","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_304":3670846.0,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXM","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_conn_diff":139.0,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXN","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_hit_rate":99.73,"tags":["varnishstat"]},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishtop","_type":"vc_server","_id":"AVZ2wZuctBGGExh6QJXc","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","parameter":"/?ondemand=2","value":7.0,"hostname":"agslx-gpolevc00"},"sort":[1470842945299]},{"_index":"agslx-gpolevc00-varnishstat","_type":"vc_server","_id":"AVZ2wZuLtBGGExh6QJXF","_score":null,"_source":{"@timestamp":"2016-08-10T15:29:05.299Z","type":"vc_server","hostname":"agslx-gpolevc00","MAIN_fetch_204":0.0,"tags":["varnishstat"]},"sort":[1470842945299]}]}}

These latest records are too old to match the Grafana range. I suspect something is wrong with the dates of these documents, maybe a timezone issue in your indexing pipeline that causes e.g. dates in the local time zone to be interpreted as UTC?

If you ask Grafana to use a range of 1 day rather than 1 hour, you should be able to see data.
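If the offset does turn out to be a timezone problem in the Logstash pipeline, the usual fix is to tell the date filter which zone the source timestamps are written in, so they are converted to UTC correctly when @timestamp is set. A minimal sketch; the field name, pattern, and zone here are assumptions, not taken from your config:

filter {
  date {
    # "logdate" and its pattern are placeholders for whatever field/format
    # the varnish events actually carry
    match    => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
    # the zone the source timestamps are written in (assumed)
    timezone => "Europe/Madrid"
    target   => "@timestamp"
  }
}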