Jan 1, reindex, implement ILM on 2020 data, move forward into 2021

hi, i'm very new to ES.. i've been using it for about a year.. but have a lot to learn.

i have built a "system" (i use that term loosely) that sends a large amount of log data via filebeat to logstash for some enrichment, then on to elasticsearch for storage. I have a year's worth of logs in indexes broken up by project name, type of log, jobsite, and year.month (the index patterns are visible in the logstash output config below):


after a year of learning, i know this is not the best approach. from my reading and research, i'd like to implement some ILM policies and move all of last year's data off my "hot" nodes onto my "warm" nodes.

my logstash conf for output is:

output {
  # note: these negations need to be joined with "and", not "or" --
  # with "or" the condition is always true, so every event matches this branch
  if [type] != "P0f" and [type] != "Suricata" and [site] != "ids" {
    elasticsearch {
      hosts => [""]
      index => "filebeat-lookout-%{[site]}-%{+yyyy.MM}"
    } #es
  } #end if

  if [type] == "P0f" {
    elasticsearch {
      hosts => [""]
      index => "filebeat-lookout-p0f-%{[site]}-%{+yyyy.MM}"
    } #es
  } #end if

  if [type] == "Suricata" {
    elasticsearch {
      hosts => [""]
      index => "filebeat-lookout-suricata-%{[site]}-%{+yyyy.MM}"
    } #es
  } #end if
} #end output

so it's my understanding that i need to create new index templates with the proper mappings and add an alias to them.
so i would have aliases like:

  • lighthouse-p0f
  • lighthouse-suricata
  • lighthouse-main-logs
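for each of these aliases, my understanding is i'd need an index template plus a bootstrap index that holds the alias as its write index, so rollover has something to roll. a rough sketch for one of them in Kibana Dev Tools syntax (the `lighthouse-p0f` name is from my list above; the policy name `lighthouse-policy` and the settings are placeholders i'd adjust):

```
PUT _index_template/lighthouse-p0f
{
  "index_patterns": ["lighthouse-p0f-*"],
  "template": {
    "settings": {
      "index.lifecycle.name": "lighthouse-policy",
      "index.lifecycle.rollover_alias": "lighthouse-p0f"
    }
  }
}

PUT lighthouse-p0f-000001
{
  "aliases": {
    "lighthouse-p0f": { "is_write_index": true }
  }
}
```

then logstash would write to `lighthouse-p0f` instead of the dated index names, and rollover handles the rest.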

once i have these in place, i would set up my ILM rules:

  • enable rollover
  • 50gb or 30 days
  • move to warm after 90 days
  • node attribute: box_type:warm
  • set replicas: 1
  • shrink: 9 shards
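if i've understood the docs, the rules above map onto a single ILM policy with a hot phase (rollover) and a warm phase. a sketch, again Dev Tools syntax; the policy name is a placeholder, and the `box_type` attribute assumes the warm nodes are tagged with `node.attr.box_type: warm`:

```
PUT _ilm/policy/lighthouse-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "50gb", "max_age": "30d" }
        }
      },
      "warm": {
        "min_age": "90d",
        "actions": {
          "allocate": {
            "number_of_replicas": 1,
            "require": { "box_type": "warm" }
          },
          "shrink": { "number_of_shards": 9 }
        }
      }
    }
  }
}
```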

once i do that.. is where i get confused..
I will need to do two things:

  • send all new data to the aliases, so new indexes pick up the ILM rules
  • reindex all my old data so it follows the ILM rules?

how do i push all my old indexes through this?
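for the old data, the two approaches i've seen (sketches only; the index names are examples based on my pattern) are to `_reindex` the old monthly indexes into the write alias, or to simply attach the policy to the existing indexes in place:

```
POST _reindex
{
  "source": { "index": "filebeat-lookout-p0f-*-2020.*" },
  "dest": { "index": "lighthouse-p0f" }
}
```

or, without reindexing:

```
PUT filebeat-lookout-*-2020.*/_settings
{
  "index.lifecycle.name": "lighthouse-policy"
}
```

is reindexing actually necessary here, or is attaching the policy enough?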
thank you


Ideally, but you don't need to.
