I am not able to save searches or edit anything in Kibana. It gives me a 403 Forbidden error.
More information would be helpful. Logs, screenshots, steps, version? Is there a proxy in between?
Waiting for your response.
It looks like you are running out of disk space and have exceeded the flood-stage watermark, which causes indices to become read-only. Free up disk space or add resources to get below the watermark, then use the APIs to make the indices writable again.
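Once space has been freed, the read-only block can be cleared with the index settings API. A minimal sketch, assuming Elasticsearch is reachable on localhost:9200 without authentication:

```shell
# Clear the read_only_allow_delete block that the flood-stage
# watermark placed on all indices (run only after freeing disk space,
# otherwise the block will simply be re-applied).
curl -X PUT "localhost:9200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'
```

Setting the value to `null` removes the block rather than toggling it to `false`, so the cluster can manage it again later if the disk fills up.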
How do I free up disk space? Is there any limit on storing logs in Elasticsearch with the Basic license?
To protect the system, indices are made read-only once less than 5% of disk space remains. How much you can store therefore depends on how much disk you have available; it is not limited by license.
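For reference, this behaviour is controlled by the disk-based allocation watermarks, which can be tuned in elasticsearch.yml. The values below are the defaults (the 5% figure above corresponds to the 95% flood stage):

```yaml
# Disk watermark defaults in elasticsearch.yml (adjust with care):
cluster.routing.allocation.disk.watermark.low: 85%
cluster.routing.allocation.disk.watermark.high: 90%
cluster.routing.allocation.disk.watermark.flood_stage: 95%
```

Raising the flood stage only delays the problem; freeing or adding disk is the real fix.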
You can resolve this by deleting indices or adding capacity to the cluster.
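Deleting an index you no longer need is a single API call. A sketch, with a hypothetical daily-log index name standing in for whatever your pipeline actually creates:

```shell
# Permanently delete an old index to reclaim disk space.
# "logstash-2019.01.01" is a placeholder -- list your indices first
# with: curl "localhost:9200/_cat/indices?v"
curl -X DELETE "localhost:9200/logstash-2019.01.01"
```

Note that deleting whole indices is allowed even while the flood-stage block is active, which is what makes it a way out of this situation.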
I am not able to delete any indices or anything; it gives me the same error. I have 500 GB of storage and there is lots of space available.
If you are using Linux, what is the output of `df -k`? What does your `elasticsearch.yml` file look like?
If there is no limit on storage for pushing logs into Elasticsearch, then why is the Monitoring tab of Kibana showing me this?
I cannot see much in those pictures. Please take a proper screenshot, or copy and paste the text instead of a picture of a screen, if you need to share this type of information.
From what I can see, you seem to be below 85% usage for all mount points. Have you used the APIs to make the indices writable? What are the current settings of e.g. the `.kibana` index?
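Checking whether an index still carries the block is one GET request. A sketch, again assuming an unauthenticated cluster on localhost:9200:

```shell
# Show the current settings of the .kibana index; look for
# "index.blocks.read_only_allow_delete": "true" in the output.
curl -X GET "localhost:9200/.kibana/_settings?pretty"
```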
No, I didn't use the APIs to make the indices writable. Can you please tell me how to show you the Kibana settings? And please tell me how to resolve the storage problem if there is no limit on pushing data into Elasticsearch.
You need to use the API once you have freed up space, as described in the docs I linked to.
In which docs?
Seems I missed adding the link. I have edited my earlier post.
I am facing difficulties. Can you please tell me how to use the system storage for Elasticsearch?
Thank you in advance.
I do not understand your question. Did you use the API I linked to? What exact problem are you facing?
Look, I have 500 GB of space on my system where my logs are saved.
I want to load more logs into Elasticsearch, but I think there is no space left on the Elasticsearch nodes.
So how can I use the system space for transferring logs to Kibana?
Is there anything apart from Elasticsearch data that is taking up space on that partition? How much of that 500GB is available to Elasticsearch?
One thing that can take up space without being visible in index statistics is the transaction log. It is used to speed up recovery within a cluster, but if you are running just a single node you can change the settings to retain less of it for a shorter period and save space that way.