How should I index 100 billion logs?


I have multiple services generating at least 2 million log messages per day. I'm using Logstash to form the documents, and I create a new index for each day's logs, but I feel there should be a better way to handle this. Should I put all the logs into one index regardless of the day, and give it 30 shards?

No, use time-based indices.

Hi Markolm,

If I just want one month of log rotation, should my Logstash output look like this:

elasticsearch {
  hosts => ""
  manage_template => false
  index => "appslog-%{+dd}"
}

Almost, you just need to adjust the time parameter; see the Logstash documentation for the supported date formats.
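A `%{+dd}` pattern only contains the day of the month, so indices from different months would collide. For monthly rotation the pattern needs the year and month; a sketch of the adjusted output (the host value here is a placeholder, and `appslog` is just the prefix from the post above):

elasticsearch {
  hosts => ["http://localhost:9200"]   # placeholder; use your cluster address
  manage_template => false
  index => "appslog-%{+YYYY.MM}"       # one index per month, e.g. appslog-2019.03
}

With this naming, dropping a month of logs is a single index deletion rather than a delete-by-query over one big index.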
