Best practice for indexing a daily log

Hi,
I have a few log files that are created daily, and I wanted to know the best way to ingest them: should I create a new index with the current date for each log file, or just add them all to one big index?
For example, I have:
dca.log.16.01.2018.log
dca.log.17.01.2018.log
dca.log.18.01.2018.log
dca.log.19.01.2018.log

Should I create a new index every day, or just create one index called "dca-logs" and add everything there?
Is there a limit on the number of records in one index? Is there any advantage to splitting indices?

What is the right way? I'm a noob with ELK and still trying to figure these things out.

Thanks,
David

How large are your logs and how long do you want to keep them in the cluster?

Using time-based indices is great for managing retention and is considered best practice for log data. You do not, however, want lots of small indices and shards, as that can be very inefficient. It may therefore be better to use e.g. monthly indices instead of daily ones if your daily data volumes are low. There is no hard limit on the number of documents in an index, although a single shard cannot hold more than roughly 2 billion documents (the Lucene per-shard limit). You can find further guidance in this blog post.
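To illustrate, here is a minimal Logstash pipeline sketch, assuming the files live under /var/log/dca/ and an index name of dca-logs (both are assumptions, adjust to your setup). It reads the daily files and writes all events into one index per month:

```
input {
  file {
    # assumed location of the daily files
    path => "/var/log/dca/dca.log.*.log"
    start_position => "beginning"
  }
}

# a date filter would normally parse each line's own timestamp into
# @timestamp; it is omitted here because the log format isn't known

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # %{+YYYY.MM} is expanded from each event's @timestamp, so Logstash
    # writes to one index per month (e.g. dca-logs-2018.01) instead of
    # one index per day
    index => "dca-logs-%{+YYYY.MM}"
  }
}
```

With monthly indices, retention then simply means dropping whole indices once they age out, e.g. DELETE /dca-logs-2018.01 via the REST API or with a tool like Curator, which is far cheaper than deleting individual documents out of one big index.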
