How should I index 100 billion logs?

Hi,

I have multiple services generating at least 2 million log messages per day. I'm using Logstash to build the documents, and I create a new index for each day's logs, but I feel there should be a better way to handle it. Should I put all the logs into one index regardless of how many days it covers, and give it 30 shards?

No, use time-based indices.

Hi Markolm,

If I just want to keep a one-month rotation of logs, should my Logstash output look like this:
elasticsearch {
  hosts => "127.0.0.1:9200"
  manage_template => false
  index => "appslog-%{+dd}"
}

Almost, you just need to adjust the date pattern in the index name; see https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-index
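For example, `%{+dd}` alone only encodes the day of month, so on the 1st of every month events would be written back into the same index. A sketch of a daily pattern that also includes year and month (keeping the `appslog` prefix and other settings from the config above) might look like this:

elasticsearch {
  hosts => "127.0.0.1:9200"
  manage_template => false
  index => "appslog-%{+YYYY.MM.dd}"
}

With one index per day, a 30-day retention then just means deleting indices older than 30 days, for example with a scheduled cleanup tool such as Curator or an index lifecycle policy, rather than deleting individual documents.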
