AutoDelete Index

Hi. I have the following Elasticsearch settings in my docker-compose file:

environment:
      - node.name=es01
      - cluster.name=es-docker-cluster
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - cluster.routing.allocation.disk.threshold_enabled=false
      - xpack.security.enabled=false
      - xpack.ml.enabled=false
      - ingest.geoip.downloader.enabled=false

I disabled xpack.ml because it was dropping my indices for no apparent reason (I saw it in the logs).

But it didn't completely solve my problem.

All three of my indices have been deleted.

I went to the logs and saw this:

transeu_elasticsearch-transeu.1.i4i6izb5bkta@api | {"@timestamp":"2022-11-02T11:11:57.638Z", "log.level": "INFO", "message":"[cargos/0BpXXkAAT3GIvlkKvnbxKA] deleting index", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[es01][masterService#updateTask][T#1]","log.logger":"org.elasticsearch.cluster.metadata.MetadataDeleteIndexService","elasticsearch.cluster.uuid":"VnjYfrjmS5meztXm9MCzQg","elasticsearch.node.id":"QaA2lOiTST-zFLJq_nCD3w","elasticsearch.node.name":"es01","elasticsearch.cluster.name":"es-docker-cluster"}
transeu_elasticsearch-transeu.1.i4i6izb5bkta@api | {"@timestamp":"2022-11-02T11:11:57.944Z", "log.level": "INFO", "message":"[companies/n8CGAmglTf-Bk4Cslr4_cg] deleting index", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[es01][masterService#updateTask][T#1]","log.logger":"org.elasticsearch.cluster.metadata.MetadataDeleteIndexService","elasticsearch.cluster.uuid":"VnjYfrjmS5meztXm9MCzQg","elasticsearch.node.id":"QaA2lOiTST-zFLJq_nCD3w","elasticsearch.node.name":"es01","elasticsearch.cluster.name":"es-docker-cluster"}
transeu_elasticsearch-transeu.1.i4i6izb5bkta@api | {"@timestamp":"2022-11-02T11:11:58.139Z", "log.level": "INFO", "message":"[read_me/T4aiFcElTCKbWHywzLoVqg] deleting index", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[es01][masterService#updateTask][T#1]","log.logger":"org.elasticsearch.cluster.metadata.MetadataDeleteIndexService","elasticsearch.cluster.uuid":"VnjYfrjmS5meztXm9MCzQg","elasticsearch.node.id":"QaA2lOiTST-zFLJq_nCD3w","elasticsearch.node.name":"es01","elasticsearch.cluster.name":"es-docker-cluster"}
transeu_elasticsearch-transeu.1.i4i6izb5bkta@api | {"@timestamp":"2022-11-02T11:11:58.330Z", "log.level": "INFO", "message":"[transports/COqWwXALRxqMsCSmIW9W0g] deleting index", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[es01][masterService#updateTask][T#1]","log.logger":"org.elasticsearch.cluster.metadata.MetadataDeleteIndexService","elasticsearch.cluster.uuid":"VnjYfrjmS5meztXm9MCzQg","elasticsearch.node.id":"QaA2lOiTST-zFLJq_nCD3w","elasticsearch.node.name":"es01","elasticsearch.cluster.name":"es-docker-cluster"}

What is this? Elasticsearch removed all my indices.

I checked my ILM policies:
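(For reference, this listing comes from the standard endpoint for ILM policies; with security disabled, no credentials are needed:)

    curl -s http://localhost:9200/_ilm/policy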

{"90-days-default":{"version":1,"modified_date":"2022-10-18T14:15:30.649Z","policy":{"phases":{"warm":{"min_age":"2d","actions":{"shrink":{"number_of_shards":1},"forcemerge":{"max_num_segments":1}}},"cold":{"min_age":"30d","actions":{}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"90d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"built-in ILM policy using the hot, warm, and cold phases with a retention of 90 days"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},"ilm-history-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:29.935Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"90d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"default policy for the ILM history indices"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":["ilm-history"]}},"watch-history-ilm-policy-16":{"version":1,"modified_date":"2022-10-18T14:15:29.813Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"3d"}}},"delete":{"min_age":"4d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"default policy for the watcher history indices"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[".watch-history-16"]}},"30-days-default":{"version":1,"modified_date":"2022-10-18T14:15:30.524Z","policy":{"phases":{"warm":{"min_age":"2d","actions":{"shrink":{"number_of_shards":1},"forcemerge":{"max_num_segments":1}}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"30d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"built-in ILM policy using the hot and warm phases with a retention of 30 days"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},"synthetics":{"version":1,"modified_date":"2022-10-18T14:15:30.140Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}}},"_meta":{"managed":true,"description":"default policy for the synthetics index template installed by x-pack"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":["synthetics"]}},".monitoring-8-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:30.939Z","policy":{"phases":{"warm":{"min_age":"0ms","actions":{"forcemerge":{"max_num_segments":1}}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"3d"}}},"delete":{"min_age":"3d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"defaults":{"delete_min_age":"Using value of [3d] based on the monitoring plugin default"},"description":"Index lifecycle policy generated for [monitoring-*-8] data streams"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[".monitoring-beats-mb",".monitoring-kibana-mb",".monitoring-ent-search-mb",".monitoring-es-mb",".monitoring-logstash-mb"]}},"slm-history-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:29.691Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"90d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"default 
policy for the SLM history indices"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[".slm-history"]}},".fleet-actions-results-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:31.083Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_size":"300gb","max_age":"30d"}}},"delete":{"min_age":"90d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"default policy for fleet action results indices"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},"365-days-default":{"version":1,"modified_date":"2022-10-18T14:15:30.845Z","policy":{"phases":{"warm":{"min_age":"2d","actions":{"shrink":{"number_of_shards":1},"forcemerge":{"max_num_segments":1}}},"cold":{"min_age":"30d","actions":{}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"365d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"built-in ILM policy using the hot, warm, and cold phases with a retention of 365 days"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},"7-days-default":{"version":1,"modified_date":"2022-10-18T14:15:30.223Z","policy":{"phases":{"warm":{"min_age":"2d","actions":{"shrink":{"number_of_shards":1},"forcemerge":{"max_num_segments":1}}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"7d"}}},"delete":{"min_age":"7d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"built-in ILM policy using the hot and warm phases with a retention of 7 days"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},".deprecation-indexing-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:29.877Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"10gb","max_age":"30d"}}},"delete":{"min_age":"30d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"ILM policy used for deprecation logs"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[".deprecation-indexing-template"]}},"metrics":{"version":1,"modified_date":"2022-10-18T14:15:30.415Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}}},"_meta":{"managed":true,"description":"default policy for the metrics index template installed by x-pack"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":["metrics"]}},"ml-size-based-ilm-policy":{"version":1,"modified_date":"2022-10-18T14:15:31.012Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb"}}}},"_meta":{"managed":true,"description":"default policy for machine learning state and stats indices"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[".ml-state",".ml-stats"]}},"180-days-default":{"version":1,"modified_date":"2022-10-18T14:15:30.732Z","policy":{"phases":{"warm":{"min_age":"2d","actions":{"shrink":{"number_of_shards":1},"forcemerge":{"max_num_segments":1}}},"cold":{"min_age":"30d","actions":{}},"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}},"delete":{"min_age":"180d","actions":{"delete":{"delete_searchable_snapshot":true}}}},"_meta":{"managed":true,"description":"built-in ILM policy using the hot, warm, and cold phases with a retention of 180 
days"}},"in_use_by":{"indices":[],"data_streams":[],"composable_templates":[]}},"logs":{"version":1,"modified_date":"2022-10-18T14:15:30.065Z","policy":{"phases":{"hot":{"min_age":"0ms","actions":{"rollover":{"max_primary_shard_size":"50gb","max_age":"30d"}}}},"_meta":{"managed":true,"description":"default policy for the logs index template installed by x-pack"}},

So I don't understand the reason for this auto-deletion.

Is your Elasticsearch exposed to the public internet?

It looks like it is exposed to the public internet, and since you explicitly set xpack.security.enabled to false, anyone can find and access your cluster.
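If the cluster only needs to be reachable from the host, one option is to publish the port on the loopback interface only, or better, re-enable security. A minimal docker-compose sketch (the ports section and the ELASTIC_PASSWORD value here are illustrative; adapt them to your stack file):

    ports:
      # Publish 9200 on localhost only, so nothing outside the host can reach it
      - "127.0.0.1:9200:9200"
    environment:
      # Better still: turn security back on and set a password for the elastic user
      - xpack.security.enabled=true
      - ELASTIC_PASSWORD=change_me

Note that under Docker Swarm the routing mesh publishes ports on all nodes, so there you may need host-mode publishing, or simply not publish the port at all.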

From your logs it looks like someone or something else is deleting your indices. For example, the read_me index being deleted there is not one of the three indices you described, which suggests an external client created it in the first place.


Today we closed port 9200 to the public internet. Now Elasticsearch is only available on the local Docker network. I am testing this solution now.
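To verify, I am checking from both sides (YOUR_PUBLIC_IP, the network name, and the es01 hostname below are placeholders for my setup):

    # From a machine outside the host: this should now time out or be refused
    curl --max-time 5 http://YOUR_PUBLIC_IP:9200
    # From a container attached to the same Docker network: this should still respond
    docker run --rm --network my_docker_network curlimages/curl -s http://es01:9200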
