Indexing in Elasticsearch using NEST returns 403 error

I have recently started learning/using Elasticsearch and I am trying to index some documents using NEST (the C# client). However, after I index a single document or run a bulk index, read_only_allow_delete is set to true in the index settings, so the next indexing call fails with a 403. If I set the flag back to false, it flips to true again after the next call.
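For reference, the indexing code looks roughly like this (a simplified sketch, not my actual model; the index name, document class, and connection URL are placeholders):

```csharp
using System;
using System.Collections.Generic;
using Nest;

public class Doc
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
            .DefaultIndex("my-index");
        var client = new ElasticClient(settings);

        // Indexing a single document works the first time...
        var indexResponse = client.IndexDocument(new Doc { Id = 1, Title = "first" });
        Console.WriteLine(indexResponse.IsValid);

        // ...but once the index is flagged read_only_allow_delete = true,
        // the next index/bulk call fails with a 403.
        var bulkResponse = client.Bulk(b => b
            .Index("my-index")
            .IndexMany(new List<Doc> { new Doc { Id = 2, Title = "second" } }));
        Console.WriteLine(bulkResponse.IsValid);            // false when blocked
        Console.WriteLine(bulkResponse.DebugInformation);   // shows the 403 details
    }
}
```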

Does anybody have an idea why this is happening?
Have I set up the index wrong?

I know about the solution explained here, but the next time I index a document the flag is set back to true again :frowning:
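For what it's worth, the reset I'm doing looks roughly like this (a sketch in NEST 7.x style; the exact descriptor methods may differ between NEST versions):

```csharp
// Sketch: clear the read-only-allow-delete block on the index.
// "client" is the ElasticClient from above; "my-index" is a placeholder name.
var response = client.Indices.UpdateSettings("my-index", u => u
    .IndexSettings(s => s
        .Setting("index.blocks.read_only_allow_delete", false)));
Console.WriteLine(response.IsValid);
```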

Do the Elasticsearch logs indicate anything? For example, Elasticsearch enforces a read-only index block (index.blocks.read_only_allow_delete: true) on every index that has one or more shards allocated on a node where at least one disk exceeds the flood stage setting cluster.routing.allocation.disk.watermark.flood_stage, which defaults to 95% of disk space used.
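To check, you could look at per-node disk usage, e.g. via the cat allocation API from NEST (a rough sketch, NEST 7.x style):

```csharp
// Sketch: list per-node disk usage to see whether any node is above the
// flood stage watermark (defaults to 95% of disk used).
var allocation = client.Cat.Allocation();
foreach (var record in allocation.Records)
{
    Console.WriteLine($"{record.Node}: {record.DiskPercent}% disk used " +
                      $"({record.DiskUsed} of {record.DiskTotal})");
}
```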

Thanks, that was actually the problem. I re-read the documentation about the read_only_allow_delete flag and it does mention the disk space issue. Clearing up some files did the trick.

