ELK is broken, rerouting shards: high disk watermark exceeded on one or more nodes

In my ELK stack, the Elasticsearch, Kibana, and Logstash services are all up,
but the Elasticsearch log shows:

>     [2018-07-21T15:53:40,353][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:54:11,044][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:54:11,044][INFO ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] rerouting shards: [high disk watermark exceeded on one or more nodes]
>     [2018-07-21T15:54:41,555][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:55:12,043][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:55:12,043][INFO ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] rerouting shards: [high disk watermark exceeded on one or more nodes]
>     [2018-07-21T15:55:42,703][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:56:13,214][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node
>     [2018-07-21T15:56:13,214][INFO ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] rerouting shards: [high disk watermark exceeded on one or more nodes]
>     [2018-07-21T15:56:43,879][WARN ][o.e.c.r.a.DiskThresholdMonitor] [_POGvYv] high disk watermark [90%] exceeded on [_POGvYvXQsK4h81xgievYw][_POGvYv][/var/lib/elasticsearch/nodes/0] free: 9.2gb[9.6%], shards will be relocated away from this node

and Kibana shows the error: No results found.
What should I do? Please guide me.

I went ahead and followed this guide, but it did not work.

The problem is temporarily fixed with this query:

# clear the read-only blocks from all indices
curl -XPUT -H "Content-Type: application/json" http://127.0.0.1:9200/_all/_settings -d '{
  "index": {
    "blocks.read_only": false,
    "blocks.read_only_allow_delete": false
  }
}'

but it only lasts a few minutes before the indices go read-only again.
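
The fix most likely keeps reverting because the disk is still above the flood-stage watermark, so Elasticsearch re-applies the read_only_allow_delete block on its next check. As a generic way to confirm how full each node's data path is (assuming the same local node as in the command above), the _cat/allocation API can be used:

curl http://127.0.0.1:9200/_cat/allocation?v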

If you are running low on disk, that's a great safeguard: it stops Elasticsearch from corrupting your indices.

You can disable or change the watermark, but I'd do that only for tests and never in production.
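
To illustrate that, here is one way the watermarks could be raised at runtime; the percentages are only examples, and the flood_stage setting exists in Elasticsearch 6.x and later:

# transient settings reset on a full cluster restart; values are illustrative only
curl -XPUT -H "Content-Type: application/json" http://127.0.0.1:9200/_cluster/settings -d '{
  "transient": {
    "cluster.routing.allocation.disk.watermark.low": "93%",
    "cluster.routing.allocation.disk.watermark.high": "95%",
    "cluster.routing.allocation.disk.watermark.flood_stage": "97%"
  }
}'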

Maybe start new nodes, or remove old indices you don't need anymore?

root@l-elk1:/var/lib/elasticsearch/nodes/0/indices# du -sh .
84G 

and

root@l-elk1:/var/lib/elasticsearch/nodes/0/indices# ls -1 | wc -l
44

and

root@l-elk1:~# df -h .
Filesystem                  Size  Used Avail Use% Mounted on
/dev/mapper/temp1--vg-root   97G   88G  4.6G  96% /

or

root@l-elk1:/var/lib/elasticsearch/nodes/0# df -h .
Filesystem                  Size  Used Avail Use% Mounted on
/dev/mapper/temp1--vg-root   97G   88G  4.6G  96% /

This indicates that the disk is full.
If I erase everything under /var/lib/elasticsearch/nodes/0/indices,
is that okay?

Yes. That's what I said.

No. Use the Elasticsearch DELETE index API.
First do a GET _cat/indices?v to know more about your existing data.
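
For reference, the curl equivalents against the same local node look like this; the index name in the DELETE is just one example taken from the listing that follows:

curl http://127.0.0.1:9200/_cat/indices?v

# delete a single index by name
curl -XDELETE http://127.0.0.1:9200/filebeat-2018.06.01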

Thank you very much!
Here is the result of GET _cat/indices?v:

> health status index               uuid                   pri rep docs.count docs.deleted store.size pri.store.size
> yellow open   filebeat-2018.06.14 YpHd2voBQyu17uiEHatcHw   5   1   10749637            0      4.6gb          4.6gb
> yellow open   filebeat-2018.07.06 TSmHUJ8tTyiinMulOa8Haw   5   1    4630600            0      1.9gb          1.9gb
> yellow open   filebeat-2018.06.28 qg-0dK_lRO6bqamWWpQ57g   5   1    4415895            0      1.8gb          1.8gb
> yellow open   filebeat-2018.07.11 7QrlFrjpTqGrTRAdBQeSMA   5   1    5702517            0      2.3gb          2.3gb
> yellow open   filebeat-2018.07.02 7XC-Ml_vS_2UEgs-l9yQgA   5   1    4349934            0      1.8gb          1.8gb
> yellow open   filebeat-2018.07.09 g_jfH5ndTQajSnUMypwjIQ   5   1    7776283            0      3.1gb          3.1gb
> yellow open   filebeat-2018.07.12 Js74qhbdT-yf03clR9i8iw   5   1    4948499            0        2gb            2gb
> yellow open   filebeat-2018.06.22 4yvoPBjaTKi3C6DdJdXjWQ   5   1    3593513            0      1.5gb          1.5gb
> yellow open   filebeat-2018.07.08 4woskikuRGeZcyygbqghwg   5   1    6560783            0      2.7gb          2.7gb
> yellow open   filebeat-2018.07.17 k91PN0_XTge9fVWbD4DgyQ   5   1          4            0     29.3kb         29.3kb
> yellow open   filebeat-2018.06.23 IB16ofUAShmxvomzLmkHRw   5   1    3631919            0      1.5gb          1.5gb
> yellow open   filebeat-2018.06.13 HwK2VXtETKyNkYznMCIv7g   5   1    3616644            0      1.6gb          1.6gb
> yellow open   filebeat-2018.07.04 aRo8w5KKQWSHuGPDJDVc2A   5   1    5367641            0      2.1gb          2.1gb
> yellow open   filebeat-2018.06.24 1vLFp_VRQs-KVo9TJKmzDg   5   1    3642272            0      1.5gb          1.5gb
> yellow open   filebeat-2018.06.12 mE2zOt_ZQpqbttV9wo8C_w   5   1    3707943            0      1.6gb          1.6gb
> yellow open   filebeat-2018.07.22 F-yOp0LyRBuZCE2Bqygw-A   5   1    2011979            0      869mb          869mb
> yellow open   filebeat-2018.06.27 ihwPduwUSA-NU1uCqLuFyg   5   1    4604393            0      1.9gb          1.9gb
> yellow open   filebeat-2018.07.15 QRJ-OlzRQRuTFTQBrcX42Q   5   1          4            0     29.4kb         29.4kb
> yellow open   filebeat-2018.06.29 IqZ8AeyvRpu8YuNyrtVTyQ   5   1    5208334            0      2.1gb          2.1gb
> yellow open   filebeat-2018.06.19 wUUnVCGiTo67XH8bcGwJvA   5   1    4118756            0      1.7gb          1.7gb
> yellow open   filebeat-2018.06.17 fd99FvAZToypkcqWkXIafA   5   1    4416221            0      1.8gb          1.8gb
> yellow open   filebeat-2018.07.07 18PDHQmFTteZ9M2Y0BoA_g   5   1    4807711            0      1.9gb          1.9gb
> yellow open   filebeat-2018.06.20 tfi-pmtET-i_ywpd-zKfyA   5   1    7576373            0        3gb            3gb
> yellow open   filebeat-2018.07.14 9PhpqrJrTVmk4BWYtRJpXg   5   1        181            0    337.5kb        337.5kb
> yellow open   filebeat-2018.06.25 T6tU5mE3RkKaKP-pClq7Iw   5   1    6822101            0      2.8gb          2.8gb
> yellow open   filebeat-2018.07.05 nsqnZlSxQi6VaKZGPo9kiQ   5   1    4716567            0      1.9gb          1.9gb
> yellow open   filebeat-2018.06.15 B6sIe6cYRreAzy15IUGFnA   5   1   12171763            0      5.1gb          5.1gb
> yellow open   filebeat-2018.06.10 ALsbhInmRXWadwImTK01uQ   5   1    4190062            0      1.8gb          1.8gb
> yellow open   filebeat-2018.06.30 GwzFwSgTRqKv5a6bYjJ4SQ   5   1    5588910            0      2.2gb          2.2gb
> yellow open   filebeat-2018.07.03 wiy-GoTrTRSht9FNJjBtow   5   1    6146243            0      2.5gb          2.5gb
> yellow open   filebeat-2018.06.26 TsnTjydvS5G3iOkhU4EmZg   5   1    5401745            0      2.2gb          2.2gb
> yellow open   filebeat-2018.06.11 6uotgT6ZROK9UPzWbMfu5g   5   1    3738753            0      1.6gb          1.6gb
> yellow open   filebeat-2018.07.10 DQGbJsmNThaCcA8SDZbtbA   5   1    6411475            0      2.6gb          2.6gb
> yellow open   filebeat-2018.06.16 RlWh6-G5RdClfUM9N4Y2nQ   5   1    4237840            0      1.7gb          1.7gb
> yellow open   filebeat-2018.06.21 WwcgVDb1TY2jlDgP3U5nkw   5   1    3784049            0      1.5gb          1.5gb
> yellow open   filebeat-2018.07.13 OVC9rb5ORBWzUE-_xFdLrg   5   1    4085789            0      1.6gb          1.6gb
> green  open   .kibana             imWFOejMScmzhZL4BhaITw   1   0         13            2     72.2kb         72.2kb
> yellow open   filebeat-2018.06.18 5_VXGDLxR5y8JPl5El6dTg   5   1    4691197            0      1.9gb          1.9gb
> yellow open   filebeat-2018.07.19 znqa-CQkQ4a2925cqZUwqQ   5   1          2            0     28.5kb         28.5kb
> yellow open   filebeat-2018.06.08 P8yaiyijTg27KQ9PcOBjMw   5   1       2001            0      1.8mb          1.8mb
> yellow open   filebeat-2018.06.09 LyuYqtCpT9S9MyPGR95gyw   5   1    7134933            0      3.2gb          3.2gb
> yellow open   filebeat-2018.07.01 L1PfHSHnTBGZZoTUfXst-g   5   1    4244428            0      1.7gb          1.7gb
> yellow open   filebeat-2018.07.21 4msB-8MFR-2TaUyfVKIV4w   5   1    3387498            0      1.4gb          1.4gb
> yellow open   filebeat-2018.06.01 RMWBQE_7TAeB7o07qrDmtA   5   1      79845            0     35.1mb         35.1mb

Are all these indices important on the server? Are they useful?
They are taking up all of the server's storage, and ELK does not work.
Can you tell me which ones should be deleted?

Well, I can't answer that for you. You are collecting data with Filebeat,
and I don't know how long you would like to keep your data in Elasticsearch.

If you don't need the June data anymore, you can probably run:

DELETE filebeat-2018.06.*
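
The curl equivalent of that wildcard delete would be something like the line below; it assumes action.destructive_requires_name is still at its default of false, otherwise wildcard deletes are rejected:

# quote the URL so the shell does not try to expand the *
curl -XDELETE 'http://127.0.0.1:9200/filebeat-2018.06.*'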

BTW you probably have too many shards per node.

May I also suggest you look at the Elastic resources about shard sizing?

A single shard per index is probably enough, and since you seem to have only one node, zero replicas would be better.
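
As a sketch of that suggestion (the template name here is made up, the legacy _template API is the 6.x form, and Filebeat ships its own index template whose settings may take precedence), future daily indices could be created with one shard and no replicas like this:

curl -XPUT -H "Content-Type: application/json" http://127.0.0.1:9200/_template/filebeat-single-shard -d '{
  "index_patterns": ["filebeat-*"],
  "order": 1,
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  }
}'

# existing indices cannot change their shard count, but replicas can be dropped in place
curl -XPUT -H "Content-Type: application/json" 'http://127.0.0.1:9200/filebeat-*/_settings' -d '{
  "index": { "number_of_replicas": 0 }
}'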

