Export has limited columns


I hope this message finds the community members safe and healthy.

I'm getting incomplete data when exporting via Kibana, and the output is erratic.

I am carrying out both searches on the same index:

  1. Total documents found: 22,174, with 115 fields (columns). The export is correct and shows all columns and rows. Total export size is 11,648,162 bytes.
  2. Total documents found: 3,291, also with 115 fields, but only two fields are exported: the time and _source columns. Total export size is 63,453 bytes. I am, however, able to see all the data in the Discover tab.
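To rule out the spreadsheet application, one quick diagnostic is to count the columns in the raw CSV header programmatically. A minimal Python sketch (the sample header below is illustrative, not taken from the actual export):

```python
import csv
import io

def csv_column_count(csv_text: str) -> int:
    """Count the columns in the first (header) row of a CSV string."""
    reader = csv.reader(io.StringIO(csv_text))
    return len(next(reader))

# Illustrative header resembling the broken export (only two columns):
broken = '"time","_source"\n"2021-01-05T00:00:00Z","{}"\n'
print(csv_column_count(broken))  # 2
```

A healthy export from this index should report 115 here, matching the field count shown in Discover.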

In both searches I get a warning: "Your CSV contains characters that spreadsheet applications might interpret as formulas."

In both searches there are no errors.

"rawResponse": {
    "took": 1562,
    "timed_out": false,
    "_shards": {
      "total": 14,
      "successful": 14,
      "skipped": 0,
      "failed": 0

How could I diagnose this and get the full export?

There are a total of 3 nodes in the cluster: 2 data nodes and 1 voting-only node. Both searches were exported via the second data node.

PS: If I combine the two time ranges, I get an error during export: Error: Max attempts reached (3). Queue timeout reached.
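For what it's worth, "Queue timeout reached" often just means the reporting job ran longer than Kibana's queue timeout. In 7.x the relevant settings live in kibana.yml; the defaults (a 2-minute queue timeout and a ~10 MB CSV size cap) can be too small for large exports. A sketch with illustrative values, not a recommendation:

```yaml
# kibana.yml (illustrative values; defaults shown in comments)
xpack.reporting.queue.timeout: 300000        # ms; default is 120000 (2 minutes)
xpack.reporting.csv.maxSizeBytes: 52428800   # bytes; default is 10485760 (10 MB)
```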

Thank you very much

Hi :wave:

A couple of questions:

  1. What Kibana version are you on?
  2. What is the difference between the two searches? Filters / time range?
  3. Do you have any selected fields in Discover for the 2nd search? If you do, then only the selected fields will get into the report.


Thank you very much for replying.

  1. Kibana version is 7.15 (entire Elastic stack is running 7.15)
  2. Both searches cover a 3-month period (January to end of March, and April to end of June). April to June works perfectly even though it has fewer documents for the search query.
  3. No specific fields are selected in the Discover tab.


I downloaded all of the data for one day (around 300,000 entries) from search one and I only got the two columns. :expressionless:

Are there other diagnostic logs that I can provide to check this further?

Hm, do you see all the columns in Discover for the January to March data?

Yes Sir! I can also expand them and run keyword searches on that data.

Just to confirm this isn't a CSV software issue (sorry, I have to ask :D): have you tried opening the raw CSV in a simple text editor to confirm that the data is indeed missing?

I'll check with the team if there are other ideas :slight_smile:

Here is a test for January 2021

From 5th January 0000 HRS to 0005 HRS.

  1. Total documents - 4,362
  2. Output from query explorer:
"rawResponse": {
    "took": 89,
    "timed_out": false,
    "_shards": {
      "total": 14,
      "successful": 14,
      "skipped": 12,
      "failed": 0

Export result: succeeded with all documents and columns. If I try anything larger, like one hour (~50,000 documents) or one day (~120,000 documents), it fails with: Error: Max attempts reached (3). Queue timeout reached.
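Since a five-minute window exports fine while an hour or a day times out, one possible workaround (a sketch of a manual approach, not an official Kibana feature) is to split the time range into small windows and export each separately:

```python
from datetime import datetime, timedelta

def split_range(start, end, step):
    """Yield (window_start, window_end) pairs covering [start, end)."""
    cur = start
    while cur < end:
        nxt = min(cur + step, end)
        yield cur, nxt
        cur = nxt

# One day (5th January 2021) in one-hour windows -> 24 smaller exports:
windows = list(split_range(datetime(2021, 1, 5),
                           datetime(2021, 1, 6),
                           timedelta(hours=1)))
print(len(windows))  # 24
```

Each window can then be set as the Discover time range and exported on its own, keeping every job well under the queue timeout.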

Yes, yes, I opened the files in Notepad++ :slight_smile:

I did a search for the month of January for a specific term and I got the data with columns.

Let me do more work on this and provide data to determine when it fails.

Could you try reloading the page just before generating the report?

  1. Set the needed time range
  2. Reload the page
  3. When the results are loaded: Share -> Generate CSV

It might be a client state bug :frowning:

Here are the steps to reproduce the bug:

  • Install Kibana
  • Install the sample kibana_sample_data_logs dataset
  • Go to Discover and select the kibana_sample_data_logs index pattern
  • Do not select any field
  • Save it as TEST
  • Go to Share / Generate CSV (A)
  • Download the CSV: it contains Timestamp & the whole JSON source
  • Save it as TEST again (without enabling create new...) without refreshing the page
  • Go to Share / Generate CSV (B)
  • Download the CSV: the source will be -

We've opened a bug: Overwriting a Saved Search introduces columns parameter with invalid values and affects CSV exports · Issue #113693 · elastic/kibana · GitHub
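Since the bug involves an invalid columns parameter being written into the saved search, one way to check a possibly affected object is to fetch it through Kibana's Saved Objects API (GET /api/saved_objects/search/&lt;id&gt;) and inspect its columns attribute. A minimal helper for the inspection step (the sample objects below are illustrative, not real API responses):

```python
def extract_columns(saved_object: dict) -> list:
    """Return the Discover column list stored on a saved-search object.

    An empty list normally means "no explicit fields selected", in which
    case the CSV export should fall back to exporting all fields.
    """
    return saved_object.get("attributes", {}).get("columns", [])

# Illustrative shapes of saved objects, as an assumption about the payload:
healthy = {"attributes": {"title": "TEST", "columns": []}}
suspect = {"attributes": {"title": "TEST", "columns": ["_source"]}}
print(extract_columns(healthy))  # []
print(extract_columns(suspect))  # ['_source']
```

If the saved search unexpectedly carries a non-empty columns value you never selected, that matches the behavior described in issue #113693.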

Hello, thank you very much for the update. I wasn't able to reproduce it after I cleared the cache and even rebooted my laptop. If I do get the same error again, I will send logs.

Further, I do have Kibana logs enabled; should I send them to you or upload them to the GitHub ticket?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.