Break up large index into multiple smaller equally sized indexes?

PUT _ilm/policy/split_large_index
{
  "policy": {
    "phases": {
      "hot": {
        "min_age": "0ms",
        "actions": {
          "rollover": {
            "max_size": "50gb",
            "max_age": "1d"
          },
          "set_priority": {
            "priority": 100
          }
        }
      },
      "warm": {
        "min_age": "0d",
        "actions": {}
      }
    }
  }
}

This is how I attached it to the index:

PUT packetbeat-7.10.0-2021.01.20-000099/_settings 
{
  "index": {
    "lifecycle": {
      "name": "split_large_index"
    }
  }
}

Based on that index name - packetbeat-7.10.0-2021.01.20-000099 - it looks like the index is already managed by ILM, which is the default for Beats, so that makes sense. That also means you won't be able to just attach the new policy to that index like that: rollover only acts on the index that is currently being written to, and by now that index almost certainly isn't.
(I should have asked what sort of data it was to begin with!)

What is the actual age and size of that index?
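If you want to double-check, the ILM explain API will show which policy the index is currently on and what phase and step it is in (using the index name from above):

GET packetbeat-7.10.0-2021.01.20-000099/_ilm/explain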


No, it isn't being written into.
The index covers 20/01/2021 through 27/01/2021; it's about 250 GB with 2 replicas, so about 500 GB in total, and 656 million documents. I need to remove the flow event.dataset documents from it. The rest of the documents come to about 102 million.
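One way to do that would be a filtered reindex into a new index, which both drops the flow documents and leaves you with a much smaller index (a sketch; the destination index name here is just an example, and you would still need to delete the original afterwards):

POST _reindex
{
  "source": {
    "index": "packetbeat-7.10.0-2021.01.20-000099",
    "query": {
      "bool": {
        "must_not": {
          "term": {
            "event.dataset": "flow"
          }
        }
      }
    }
  },
  "dest": {
    "index": "packetbeat-7.10.0-no-flow-000001"
  }
}

For a reindex of this size it is usually worth adding wait_for_completion=false so it runs as a background task you can monitor with the task management API.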

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.