Elasticsearch: 'x_content_parse_exception' [template] unknown field [lifecycle]

I have the ELK stack deployed from deviantony/docker-elk, version 8.9; every component is 8.9.

I'm trying to create a lifecycle with a retention policy for my data stream, as described in the official documentation:


PUT _index_template/my-index-template
{
  "index_patterns": ["my-data-stream*"],
  "data_stream": { },
  "priority": 500,
  "template": {
    "lifecycle": {
      "data_retention": "7d"
    }
  },
  "_meta": {
    "description": "Template with data stream lifecycle"
  }
}

but I get this error:

{
  "error": {
    "root_cause": [
      {
        "type": "x_content_parse_exception",
        "reason": "[6:5] [template] unknown field [lifecycle]"
      }
    ],
    "type": "x_content_parse_exception",
    "reason": "[6:18] [index_template] failed to parse field [template]",
    "caused_by": {
      "type": "x_content_parse_exception",
      "reason": "[6:5] [template] unknown field [lifecycle]"
    }
  },
  "status": 400
}

Likewise, when I try to get an existing data stream lifecycle, or create a new lifecycle like this:

GET _data_stream/logs-generic-default/_lifecycle
PUT _data_stream/my-data-stream/_lifecycle
{
  "data_retention": "30d"
}

I get a similar error:

{
  "error": "no handler found for uri [/_data_stream/my-data-stream/_lifecycle?pretty=true] and method [PUT]"
}

Any ideas?

Hi @anton3. This is an experimental new feature we've just added in 8.9, and it looks like we didn't correctly document how to enable it. Just to emphasize though, this is an unreleased experimental feature that is subject to change in future versions. If that's what you're looking for (and we would certainly appreciate any feedback on it), the way to enable it is to pass -Des.dlm_feature_flag_enabled=true as a JVM argument when you start your Elasticsearch nodes (all of them). Please don't run this in production since it is just a feature preview.
If you are just looking for the currently-supported way to handle retention in data streams, ILM is the recommended way right now (Set up a data stream | Elasticsearch Guide [8.9] | Elastic).
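For anyone running the docker-elk stack: one way to pass that JVM argument is through the Elasticsearch service's JVM options environment variable. This is only a sketch assuming the standard deviantony/docker-elk compose layout (the service name and the default `ES_JAVA_OPTS` heap settings shown here may differ in your setup):

```yaml
# docker-compose.yml (sketch) -- append the DLM feature flag to the JVM
# options of every Elasticsearch node; heap sizes mirror docker-elk defaults
services:
  elasticsearch:
    environment:
      ES_JAVA_OPTS: -Xms512m -Xmx512m -Des.dlm_feature_flag_enabled=true
```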


Thanks for replying, because I had already lost hope :slight_smile:
One more question. If I apply a lifecycle policy with a delete phase to my data stream, will already-existing data be affected? If so, how quickly will these changes affect existing data?

As far as I can see now, after applying the lifecycle policy the old data was not deleted, even though it's older than the policy allows.

If I apply a lifecycle policy with a delete phase to my data stream, will already-existing data be affected? If so, how quickly will these changes affect existing data?

Are you talking about the experimental lifecycle data_retention? I believe it will delete older data the first time it runs, but by default it only checks every 10 minutes, so you might have to wait up to 10 minutes to see your older data deleted. If that's not the case, please paste your data stream configuration here.
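For reference, that check cadence should be tunable via a cluster setting. If I'm reading the 8.9 documentation correctly the setting is `data_streams.lifecycle.poll_interval`, but treat the name as tentative since the feature is experimental and may change:

```
PUT _cluster/settings
{
  "persistent": {
    "data_streams.lifecycle.poll_interval": "1m"
  }
}
```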

I'm referring to the recommended way you pointed to.
For example, the auto-created stream logs-generic-default also has an index attached to it.
I created my-index-template with lifecycle-policy-7days, and as you can see it was applied to this stream.

This is the configuration of lifecycle policy:

{
  "lifecycle-policy-7days": {
    "version": 3,
    "modified_date": "2023-08-08T07:11:34.540Z",
    "policy": {
      "phases": {
        "hot": {
          "min_age": "0ms",
          "actions": {
            "rollover": {
              "max_primary_shard_size": "50gb"
            }
          }
        },
        "delete": {
          "min_age": "7d",
          "actions": {
            "delete": {
              "delete_searchable_snapshot": true
            }
          }
        }
      }
    },
    "in_use_by": {
      "indices": [
        ".ds-logs-generic-default-2023.08.02-000001"
      ],
      "data_streams": [
        "logs-generic-default"
      ],
      "composable_templates": [
        "my-index-template"
      ]
    }
  }
}

I'm probably doing something wrong, but I don't know what exactly, because I have logs older than 7 days and they weren't deleted. I even tried setting "min_age": "3d" for the delete phase, but nothing changed.

Oh OK. I'm glad you went with the currently-supported option. You will probably get better responses if you open a new post for your latest question. There are a lot of ILM experts on this forum (I am not one of them). Most of them are probably not going to read a post with x_content_parse_exception in its title.
One thing to keep in mind with ILM though -- the delete min_age is the minimum amount of time after rollover before the index will be deleted. Since your data stream is fairly new, it has probably not been 7 days since the first data was rolled over into a new index.
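A quick way to check where an index sits in its policy, including its age since rollover, is the ILM explain API, for example:

```
GET .ds-logs-generic-default-2023.08.02-000001/_ilm/explain
```

The response shows the index's current phase, action, and age. Also note that ILM itself only evaluates policies periodically (`indices.lifecycle.poll_interval`, 10 minutes by default), so deletions are not instantaneous even once min_age is reached.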


Agreed. Thank you for the explanation anyway, I think that's enough for me :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.