ILM policy for Filebeat through the config file

I want to set up ILM through filebeat.yml. I have the following config:


    # ====================== Index Lifecycle Management (ILM) ======================

    # Configure index lifecycle management (ILM). These settings create a write
    # alias and add additional settings to the index template. When ILM is enabled,
    # output.elasticsearch.index is ignored, and the write alias is used to set the
    # index name.

    # Enable ILM support. Valid values are true, false, and auto. When set to auto
    # (the default), the Beat uses index lifecycle management when it connects to a
    # cluster that supports ILM; otherwise, it creates daily indices.
    setup.ilm.enabled: auto

    # Set the prefix used in the index lifecycle write alias name. The default alias
    # name is 'filebeat-%{[agent.version]}'.
    setup.ilm.rollover_alias: 'filebeat-%{[agent.version]}'

    # Set the rollover index pattern. The default is "%{now/d}-000001".
    setup.ilm.pattern: "{now/d}-000001"

    # Set the lifecycle policy name. The default policy name is
    # 'beatname'.
    setup.ilm.policy_name: "filebeat-rollover-7-days"

    # The path to a JSON file that contains a lifecycle policy configuration. Used
    # to load your own lifecycle policy.
    #setup.ilm.policy_file:

    # Disable the check for an existing lifecycle policy. The default is true. If
    # you disable this check, set setup.ilm.overwrite: true so the lifecycle policy
    # can be installed.
    setup.ilm.check_exists: true

    # Overwrite the lifecycle policy at startup. The default is false.
    setup.ilm.overwrite: true

This creates a policy named filebeat-rollover-7-days, but the policy has only the default configuration.
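
To double-check what got installed, the policy can be queried directly (a quick curl, using the elasticsearch:9200 host from my output config):

    curl -s http://elasticsearch:9200/_ilm/policy/filebeat-rollover-7-days

It comes back with just the default hot/rollover configuration.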

Then I created ilm-policy.json (shown below) and set setup.ilm.policy_file: /usr/share/filebeat/policy/ilm-policy.json in filebeat.yml.

ilm-policy.json:
    {
    	"filebeat-rollover-7-days": {
    		"policy": {
    			"phases": {
    				"warm": {
    					"min_age": "5d",
    					"actions": {
    						"readonly": {},
    						"set_priority": {
    							"priority": 50
    						}
    					}
    				},
    				"cold": {
    					"min_age": "7d",
    					"actions": {
    						"freeze": {},
    						"set_priority": {
    							"priority": 0
    						}
    					}
    				},
    				"hot": {
    					"min_age": "0ms",
    					"actions": {
    						"rollover": {
    							"max_size": "50gb",
    							"max_age": "3d"
    						},
    						"set_priority": {
    							"priority": 100
    						}
    					}
    				},
    				"delete": {
    					"min_age": "10d",
    					"actions": {
    						"delete": {
    							"delete_searchable_snapshot": true
    						}
    					}
    				}
    			}
    		}
    	}
    }
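
To take Filebeat out of the picture, the same request can be sent by hand (a curl sketch, using the policy file path from my config):

    curl -s -X PUT http://elasticsearch:9200/_ilm/policy/filebeat-rollover-7-days \
      -H 'Content-Type: application/json' \
      --data-binary @/usr/share/filebeat/policy/ilm-policy.json

which I'd expect to fail the same way as Filebeat's own request.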

Now I see the following error in the Filebeat logs:

    2021-07-01T20:52:29.956Z	ERROR	[publisher_pipeline_output]	pipeline/output.go:154	Failed to connect to backoff(elasticsearch(http://elasticsearch:9200)): Connection marked as failed because the onConnect callback failed: 400 Bad Request: {"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"[1:2] [put_lifecycle_request] unknown field [filebeat-rollover-7-days]"}],"type":"x_content_parse_exception","reason":"[1:2] [put_lifecycle_request] unknown field [filebeat-rollover-7-days]"},"status":400}
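
If I read the ILM API docs correctly, the body of a PUT _ilm/policy request is parsed as put_lifecycle_request (the type named in the error): the policy name comes from the URL, and the body starts at a top-level "policy" key. A minimal sketch of that shape (hot phase abbreviated):

    curl -s -X PUT http://elasticsearch:9200/_ilm/policy/filebeat-rollover-7-days \
      -H 'Content-Type: application/json' \
      -d '{"policy": {"phases": {"hot": {"actions": {"rollover": {"max_size": "50gb", "max_age": "3d"}}}}}}'

So it looks like the extra "filebeat-rollover-7-days" wrapper in my file is what the parser rejects, but I'm not sure whether setup.ilm.policy_file expects exactly the API body format.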

What am I missing here?
