I have a three-node cluster running in my homelab. It is a virtualised environment (ESXi) running Ubuntu Server (LTS), with two data nodes and a third voting-only node. Both data nodes run a Kibana instance. The underlying hardware is SSD-backed, with separate disks for the OS and for Elasticsearch log storage. Exporting logs usually fails for me, with two prominent errors:
Error: Failed to decrypt report job data. Please ensure that xpack.reporting.encryptionKey is set and re-generate this report. Error: Unsupported state or unable to authenticate data
A. I only get this error when both Kibana instances are running. If I manually stop Kibana (service kibana stop) on one of the VMs, I am able to export logs successfully.
B. I found configuration inconsistencies between the Kibana instances. I've fixed this and restarted the services.
C. Neither Kibana instance has xpack.reporting.encryptionKey set. I'm not sure what this should be set to. Should it be the same as xpack.encryptedSavedObjects.encryptionKey?
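For anyone hitting the same symptom: when multiple Kibana instances serve the same cluster, they must all share the same xpack.reporting.encryptionKey, otherwise a report queued by one instance cannot be decrypted by the other (which matches the behaviour in A). When the key is unset, each instance generates its own random key at startup, so two instances will never agree. It does not need to equal xpack.encryptedSavedObjects.encryptionKey; they are independent settings, though both must be identical across instances. A minimal sketch of what kibana.yml could look like on both VMs (the key values below are placeholders, not real keys):

```yaml
# kibana.yml — must be IDENTICAL on both Kibana instances.
# Generate real values yourself, e.g. with: bin/kibana-encryption-keys generate
# Each key must be at least 32 characters long.
xpack.reporting.encryptionKey: "placeholder_reporting_key_at_least_32_chars"
xpack.security.encryptionKey: "placeholder_security_key_at_least_32_chars"
xpack.encryptedSavedObjects.encryptionKey: "placeholder_eso_key_at_least_32_chars"
```

After setting these, restart both Kibana services and re-generate the report; previously queued reports encrypted with the old random keys will still fail to decrypt.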
I've set xpack.reporting.csv.maxSizeBytes: 100mb, yet most of the reports either time out or fail with a "data too large" error. What is the best way to overcome this?
A. I'm a student and I have been collecting data for over a year. The index size is over 1 TB. I need to export some of this data for my reporting. Without an export, the entire year of collection may go to waste.
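A note on the second error, in case it helps others: "data too large" usually comes from an Elasticsearch circuit breaker tripping, not from the Kibana-side size cap, so raising xpack.reporting.csv.maxSizeBytes alone won't fix it. Reducing how much data each reporting request pulls, and giving the job more time, tends to help. A sketch of the relevant kibana.yml settings (the values are illustrative assumptions to tune for your heap, not recommendations):

```yaml
# kibana.yml — illustrative values only; tune for your nodes' heap size
xpack.reporting.csv.maxSizeBytes: 104857600   # 100 MB cap on the generated CSV
xpack.reporting.csv.scroll.size: 500          # smaller pages = less memory per request
xpack.reporting.csv.scroll.duration: "2m"     # keep the scroll context alive longer
xpack.reporting.queue.timeout: 600000         # 10 min (ms) before a job is timed out
```

That said, Kibana CSV reporting is not really built for exporting a meaningful slice of a 1 TB index; for bulk extraction it is generally better to read directly from Elasticsearch with the scroll or point-in-time (search_after) APIs, or to take a snapshot of the index, and reserve Kibana reports for filtered subsets.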