I'm not very familiar with ELK/SIEM yet, and I'm also not sure about the proper category here.
We have thousands of Filebeat ingest pipelines; I assume they're auto-created during updates. Is it safe to delete the old ones, or even all of them, and let Filebeat recreate the ones it needs? Is there anything to clean those up automatically, or any hints or tips?
Or would that usually be handled by ILM? Those policies don't seem to work: the indices have all been stuck in the rollover/hot phase for years, and I can't see what's blocking them or what's going on there.
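For context, these are the kinds of API calls I've been poking at in Kibana Dev Tools (the `filebeat*` patterns are just my guess at what matches; I haven't actually deleted anything yet):

```
# List all Filebeat ingest pipelines (this is where I see thousands of entries)
GET _ingest/pipeline/filebeat*

# Ask ILM why an index hasn't moved out of the hot phase
GET filebeat-*/_ilm/explain

# What I'm considering, but unsure is safe:
# DELETE _ingest/pipeline/filebeat-<old-version>-*
```

The `_ilm/explain` output shows the indices sitting in the hot phase, but I don't understand what it's waiting for.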