Inconsistent content in data table visualization

Hi,
I’m trying to use the data table visualization to show error logs from tests I run, but I’m having trouble getting it to show all the data.

Basically I need all my tables to be sorted by date (the e2e_test_start_time field), with no rows missing (if a document exists and fits my date range, it should be shown).

But the content is inconsistent: sometimes rows are missing, and not all relevant data is shown.

For example,
I set a 4-day date range: 2019-04-18 00:00:00.000 - 2019-04-21 23:59:59.999
The order on the e2e_test_start_time field is set to Descending.
Many logs occurred during this time, but only 2 entries are shown:

With the same 4-day date range, I changed the order to Ascending and got no data at all (I would love to know why this happened):

Changing back to Descending and narrowing the date range to 2 days, I now get more results:

Please help me understand what I am doing wrong:

  1. Why does a larger date range show less data? (Even though the limit is 200, it only showed 2 entries.)
  2. How can I make it present the data sorted by date?

Thanks,
Ziv

Hello @zivklara,
thanks for your question.

This is strange behaviour, but before filing a bug ticket I'd like to ask you a few more questions:

  • on the first example (4 days, descending order, size 200), have you tried hitting the refresh button on the right of the query bar? If so, can you please post here the request and response available in the inspector panel (top bar Inspect -> View: Requests)? Please remove any sensitive data from it first. I'd like to check whether the inconsistency is in the query or in the visualization itself (a rough sketch of what that request usually looks like is below this list, for reference).

  • can you flag Group other values in separate bucket and apply the changes, to see if we can at least confirm that there are other documents in the same time range?
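
For reference, the data table is built from a bucket aggregation under the hood, so the request body you will find in the inspector should look roughly like the sketch below. This is only an approximation: the exact shape depends on your Split Rows configuration, and everything except e2e_test_start_time, the date range and the 200 size is a placeholder.

{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "e2e_test_start_time": {
              "gte": "2019-04-18T00:00:00.000",
              "lte": "2019-04-21T23:59:59.999"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "rows": {
      "terms": {
        "field": "e2e_test_start_time",
        "size": 200,
        "order": { "_key": "desc" }
      }
    }
  }
}

Comparing your real request against that sketch should already tell us whether the 2-row result comes from the query itself or from how the table renders the buckets.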

Thanks for your patience
Marco

Hi, thanks for the response :slight_smile:

Regarding #2, I’ll have to check that later, and will update later today or tomorrow.
But in the 3rd example, you can see documents that are in the range (April 19) that are not shown in the first example (whose range was April 18-21).

Now for #1, the error counter increases by 1 on every mouse click (on Refresh, on the arrow near ‘Split Rows’, and anywhere else on the screen):

Here are the request and response from when I clicked Refresh (I hope that’s what you meant; if not, please let me know and I’ll try to get it right):

Request:
POST /elasticsearch/_msearch HTTP/1.1
Host: e2e-kibana
Connection: keep-alive
Content-Length: 1614
Accept: application/json, text/plain, */*
Origin: http://e2e-kibana
kbn-version: 6.5.2
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.103 Safari/537.36
content-type: application/x-ndjson
Referer: http://e2e-kibana/app/kibana
Accept-Encoding: gzip, deflate
Accept-Language: he-IL,he;q=0.9,en-US;q=0.8,en;q=0.7
Cookie: io=lzdVqfCvAPQ8HWFRAAAI

Response:
HTTP/1.1 200 OK
warning: 299 Elasticsearch-6.5.1-8c58350 "Deprecated field [inline] used, expected [source] instead" "Mon, 29 Apr 2019 10:15:20 GMT", 299 Elasticsearch-6.5.1-8c58350 "Deprecated field [valueType] used, expected [value_type] instead" "Mon, 29 Apr 2019 10:15:20 GMT", 299 Elasticsearch-6.5.1-8c58350 "returning default values for missing document values is deprecated. Set system property '-Des.scripting.exception_for_missing_value=true' to make behaviour compatible with future major versions!" "Mon, 29 Apr 2019 10:15:20 GMT"
content-type: application/json; charset=UTF-8
kbn-name: kibana
kbn-xpack-sig: c27be63268dc2af63250e0931086a350
cache-control: no-cache
vary: accept-encoding
content-encoding: gzip
connection: close
Date: Mon, 29 Apr 2019 10:15:20 GMT
Transfer-Encoding: chunked

Thanks. For the request and response I meant a different Kibana feature: you can find the query inspector in the top bar of your Kibana app, highlighted in red in this screenshot:

Regarding the error, just as a test, could you please disable all the Chrome extensions you have? It seems to be something caused by one of them, but I'm just triaging the root cause. Thanks

Ok, I think I got it right this time...

  • The response is around 4000 lines, and I don't see an option to upload a file here, so I'm adding links for downloading the request and response files:
    request: https://ufile.io/kp3jmj27
    response: https://ufile.io/6j7iwl46

  • I flagged 'Group other values in separate bucket', and then I got more rows, but with 'Other' instead of a date:

  • Regarding the errors, I disabled all of my extensions (except one that cannot be disabled), and I still get those errors on every mouse click.

Thanks

The response seems fine, so this should be an issue in Kibana. I will take a look at the existing issues and see if there is one already, and I will also try to reproduce it with the same data on my end.
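
In the meantime, you can cross-check the raw documents directly against Elasticsearch (for example from Dev Tools), outside of the visualization. That also covers your second question, since a plain search sorted on the date field returns every matching document regardless of how the table buckets them. A minimal sketch, assuming a placeholder index pattern e2e-logs-* in place of your real one:

GET e2e-logs-*/_search
{
  "size": 200,
  "query": {
    "range": {
      "e2e_test_start_time": {
        "gte": "2019-04-18T00:00:00.000",
        "lte": "2019-04-21T23:59:59.999"
      }
    }
  },
  "sort": [
    { "e2e_test_start_time": { "order": "desc" } }
  ]
}

If that returns all the documents you expect while the table still shows only two rows, it confirms the data itself is fine and the problem is in how the visualization buckets it.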

Thanks

Hi, did you find the root cause? Is it really a bug, or another issue?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.