Recent data missing in dashboard/visualizations

Hello,

I seem to have an issue with recent data not being available for visualizations or dashboards in Kibana. I have Winlogbeat data from several servers going into my ELK stack. I have saved searches for this data that work fine, and I can see/search the logs up to the minute.

However, when I try to create any visualizations using these saved searches, no data from roughly the last 3 days is visible. If I set the time range to within the last 3 days, I get "No results found". If I move the time range back further than 3 days, I can see data.

I am able to search in the Discover tab using the same fields I am using for the visualizations, so the fields seem correct. This is a very weird issue, and I am only having it with Beats data. All my syslog, NetFlow, etc. works fine.

My environment:
2 - CentOS 7 ES nodes 6.2.2-1
1 - CentOS 7 ES client node 6.2.2-1 with Kibana 6.2.2-1
1 - CentOS 7 Logstash 6.2.2-1

It sounds to me like the data itself is the issue, possibly something to do with how the Beat is configured. Maybe the date is getting mangled somehow and ends up offset by three days? I would look into that, and maybe ask the question in the Beats forum too to see if they can provide any insight. Could you also post the saved searches and visualization config? Maybe there's an extra filter in there somewhere.
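One quick test of the skewed-date theory: if the Beat's clock (or a timezone conversion) is pushing @timestamp forward, the newest documents end up dated in the future and only enter a Kibana time range once enough days pass. A minimal sketch of such a check, assuming the logstash-winlogbeat-* index pattern seen in the responses below; adjust the pattern and endpoint for your cluster:

```python
import json

# Sketch of a search body that looks for future-dated events. If Winlogbeat
# timestamps are skewed forward, the newest documents sit "in the future"
# and fall outside any recent time range in Kibana.
skew_check = {
    "size": 5,
    "sort": [{"@timestamp": {"order": "desc"}}],
    # "now" is evaluated by Elasticsearch at query time; any hits here
    # carry a timestamp later than the cluster's current time.
    "query": {"range": {"@timestamp": {"gt": "now"}}},
}

body = json.dumps(skew_check)
print(body)
# Run it against the cluster, e.g.:
#   curl -s -H 'Content-Type: application/json' \
#        'http://localhost:9200/logstash-winlogbeat-*/_search' -d @query.json
```

Any documents returned would confirm the offset theory directly, without going through Kibana at all.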

Here are the saved searches & results and the visualization & results, truncated for length.

1a. saved search request

{ "version": true, "size": 500, "sort": [ { "@timestamp": { "order": "desc", "unmapped_type": "boolean" } } ], "_source": { "excludes": [] }, "aggs": { "2": { "date_histogram": { "field": "@timestamp", "interval": "30s", "time_zone": "America/New_York", "min_doc_count": 1 } } }, "stored_fields": [ "*" ], "script_fields": {}, "docvalue_fields": [ "@timestamp", "event_data.DeviceTime", "event_data.NewTime", "event_data.OldTime", "event_data.StartTime", "event_data.StopTime", "user_data.UTCStartTime" ], "query": { "bool": { "must": [ { "match_all": {} }, { "bool": { "should": [ { "match_phrase": { "level": "Error" } }, { "match_phrase": { "level": "error" } } ], "minimum_should_match": 1 } }, { "bool": { "should": [ { "match_phrase": { "log_name": "Application" } }, { "match_phrase": { "log_name": "application" } } ], "minimum_should_match": 1 } }, { "range": { "@timestamp": { "gte": 1521157883853, "lte": 1521158783853, "format": "epoch_millis" } } } ], "filter": [], "should": [], "must_not": [] } }, "highlight": { "pre_tags": [ "@kibana-highlighted-field@" ], "post_tags": [ "@/kibana-highlighted-field@" ], "fields": { "*": {} }, "fragment_size": 2147483647 } }

1b. saved search response
{ "took": 1592, "hits": { "hits": [ { "_index": "logstash-winlogbeat-2018.03.16", "_type": "doc", "_id": "WWsgLGIBuvW35OQvOvFB", "_version": 1, "_score": null, "_source": { "@version": "1", "record_number": "71950", "event": { "host": "PRDEXCH003", "type": "wineventlog" }, "tags": [ "beat", "beats_input_codec_plain_applied" ], "node": { "hostname": "PRDEXCH003", "ipaddr": "PRDEXCH003" }, "task": "TransportService", "@timestamp": "2018-03-16T00:05:58.000Z", "event_id": 12014, "message": "Microsoft Exchange could not find a certificate that contains the domain name ...", "event_data": { "param2": "PRDEXCH002", "param1": "PRDEXCH003" }, "source_name": "MSExchangeTransport", "flow": { "geoip": { "autonomous_system": "private" } }, "log_name": "Application", "computer_name": "PRDEXCH003.bostonheartlab.local", "beat": { "version": "6.2.0", "hostname": "PRDEXCH003", "name": "PRDEXCH003" }, "keywords": [ "Classic" ], "level": "Error" }, "fields": { "@timestamp": [ "2018-03-16T00:05:58.000Z" ] }, "highlight": { "level": [ "@kibana-highlighted-field@Error@/kibana-highlighted-field@" ], "log_name": [ "@kibana-highlighted-field@Application@/kibana-highlighted-field@" ] }, "sort": [ 1521158758000 ] } ], "total": 7, "max_score": 0 }, "aggregations": { "2": { "buckets": [ { "key_as_string": "2018-03-15T19:52:00.000-04:00", "key": 1521157920000, "doc_count": 1 }, { "key_as_string": "2018-03-15T19:53:30.000-04:00", "key": 1521158010000, "doc_count": 1 }, { "key_as_string": "2018-03-15T19:58:00.000-04:00", "key": 1521158280000, "doc_count": 3 }, { "key_as_string": "2018-03-15T19:58:30.000-04:00", "key": 1521158310000, "doc_count": 1 }, { "key_as_string": "2018-03-15T20:05:30.000-04:00", "key": 1521158730000, "doc_count": 1 } ] } } }

2a. visualization request
{ "size": 0, "aggs": { "2": { "terms": { "field": "event.host.keyword", "size": 5, "order": { "_count": "desc" } } } }, "version": true, "_source": { "excludes": [] }, "stored_fields": [ "*" ], "script_fields": {}, "docvalue_fields": [ "@timestamp", "event_data.DeviceTime", "event_data.NewTime", "event_data.OldTime", "event_data.StartTime", "event_data.StopTime", "user_data.UTCStartTime" ], "query": { "bool": { "must": [ { "match_all": {} }, { "match_all": {} }, { "bool": { "should": [ { "match_phrase": { "level": "Error" } }, { "match_phrase": { "level": "error" } } ], "minimum_should_match": 1 } }, { "bool": { "should": [ { "match_phrase": { "log_name": "Application" } }, { "match_phrase": { "log_name": "application" } } ], "minimum_should_match": 1 } }, { "range": { "@timestamp": { "gte": 1521158309543, "lte": 1521159209543, "format": "epoch_millis" } } } ], "filter": [], "should": [], "must_not": [] } }, "highlight": { "pre_tags": [ "@kibana-highlighted-field@" ], "post_tags": [ "@/kibana-highlighted-field@" ], "fields": { "*": {} }, "fragment_size": 2147483647 } }

2b. visualization response
{ "took": 381, "timed_out": false, "_shards": { "total": 3695, "successful": 3695, "skipped": 3685, "failed": 0 }, "hits": { "total": 3, "max_score": 0, "hits": [] }, "aggregations": { "2": { "doc_count_error_upper_bound": 0, "sum_other_doc_count": 0, "buckets": [] } }, "status": 200 }

The difference I notice is that the range values in the query the visualization makes differ from the ones the stored search uses. Maybe the time picker is interfering? Can you just omit the range part of the query?
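Those range values can be decoded directly; each turns out to be a 15-minute window taken only a few minutes apart, so the bounds themselves don't show anything like a 3-day offset. A quick sketch, using only the epoch_millis values copied from the requests above:

```python
from datetime import datetime, timezone

def ms_to_iso(ms):
    """Convert an epoch_millis value (as Kibana sends it) to an ISO-8601 UTC string."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat(timespec="milliseconds")

# Bounds copied from the saved-search and visualization requests above
saved_search = (1521157883853, 1521158783853)
visualization = (1521158309543, 1521159209543)

for name, (gte, lte) in [("saved search", saved_search), ("visualization", visualization)]:
    print(name, ms_to_iso(gte), "->", ms_to_iso(lte), f"({(lte - gte) / 60000:.0f} min)")
# saved search  2018-03-15T23:51:23.853+00:00 -> 2018-03-16T00:06:23.853+00:00 (15 min)
# visualization 2018-03-15T23:58:29.543+00:00 -> 2018-03-16T00:13:29.543+00:00 (15 min)
```

Both windows cover the same evening as the documents in the saved-search response, which makes it all the stranger that one request finds hits and the other doesn't.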

Removing the range from the query gives results. However, it's the same number of results as when I simply pick a long range (last 2 years) in the time picker in the UI. That just tells me it's matching the old data but still not the newer data.

Here is the response when I removed the range:
{ "took": 29107, "timed_out": false, "num_reduce_phases": 8, "_shards": { "total": 3856, "successful": 3856, "skipped": 0, "failed": 0 }, "hits": { "total": 26846, "max_score": 0, "hits": [] }, "aggregations": { "2": { "doc_count_error_upper_bound": 0, "sum_other_doc_count": 36, "buckets": [ { "key": "PRDEXCH001", "doc_count": 11606 }, { "key": "PRDEXCH002", "doc_count": 7099 }, { "key": "PRDEXCH003", "doc_count": 5548 }, { "key": "PRDDC100", "doc_count": 325 }, { "key": "PRDDC200", "doc_count": 86 } ] } } }

Here is the request when I just picked the last 2 years in the time picker:

{ "size": 0, "aggs": { "2": { "terms": { "field": "event.host.keyword", "size": 5, "order": { "_count": "desc" } } } }, "version": true, "_source": { "excludes": [] }, "stored_fields": [ "*" ], "script_fields": {}, "docvalue_fields": [ "@timestamp", "event_data.DeviceTime", "event_data.NewTime", "event_data.OldTime", "event_data.StartTime", "event_data.StopTime", "user_data.UTCStartTime" ], "query": { "bool": { "must": [ { "match_all": {} }, { "match_all": {} }, { "bool": { "should": [ { "match_phrase": { "level": "Error" } }, { "match_phrase": { "level": "error" } } ], "minimum_should_match": 1 } }, { "bool": { "should": [ { "match_phrase": { "log_name": "Application" } }, { "match_phrase": { "log_name": "application" } } ], "minimum_should_match": 1 } }, { "range": { "@timestamp": { "gte": 1458139115131, "lte": 1521211115131, "format": "epoch_millis" } } } ], "filter": [], "should": [], "must_not": [] } }, "highlight": { "pre_tags": [ "@kibana-highlighted-field@" ], "post_tags": [ "@/kibana-highlighted-field@" ], "fields": { "*": {} }, "fragment_size": 2147483647 } }

RESPONSE
{ "took": 1218, "timed_out": false, "num_reduce_phases": 7, "_shards": { "total": 3695, "successful": 3695, "skipped": 121, "failed": 0 }, "hits": { "total": 26840, "max_score": 0, "hits": [] }, "aggregations": { "2": { "doc_count_error_upper_bound": 0, "sum_other_doc_count": 36, "buckets": [ { "key": "PRDEXCH001", "doc_count": 11606 }, { "key": "PRDEXCH002", "doc_count": 7099 }, { "key": "PRDEXCH003", "doc_count": 5548 }, { "key": "PRDDC100", "doc_count": 318 }, { "key": "PRDDC200", "doc_count": 86 } ] } }, "status": 200 }
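Decoding the 2-year bounds the same way shows a window ending right at query time on 2018-03-16, so the recent documents should have matched unless their @timestamp lies beyond it. A quick sketch, again using only the epoch_millis values from the request above:

```python
from datetime import datetime, timezone

def ms_to_iso(ms):
    # Convert epoch_millis to an ISO-8601 UTC string, truncated to seconds
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat(timespec="seconds")

# Bounds from the "last 2 years" request above
gte, lte = 1458139115131, 1521211115131
print(ms_to_iso(gte), "->", ms_to_iso(lte))       # 2016-03-16 ... -> 2018-03-16 ...
print("span in days:", (lte - gte) / 86_400_000)  # exactly 730.0
```

So even a window reaching up to "now" misses the newest Beats documents, which again points at the documents' timestamps rather than the query.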

@timroes @thomasneirynck any ideas on this one? I'm a little stumped as to the difference in behavior.
