Elasticsearch best practice

Hello, I am indexing 1 million logs every 15 minutes into one index, and I need about 10,000 of them to visualize in Grafana and build reports on.
Is it good practice to separate those 10,000 logs per 15 minutes into another index? Will searching be quicker if I rotate that index daily and end up with a smaller shard size?

What does "them" refer to here?

I am indexing 1 million logs and need to visualize and report on roughly 10,000 of them, but I must collect everything. Should I duplicate those 10,000 logs into another index, or split them out into a separate index? Or does it not matter that I have 1 million logs and only ever search for 10,000 of them?
The 1 million are all application logs, and about 10,000 of them are from trades that I need to visualize.

Are the logs in the same format for both types?

Yes, they are. The logs are pushed from the application by the Logback Elasticsearch Appender.

Unless there's no easy way to filter between the two log types, I wouldn't worry too much about splitting them out at that sort of volume.
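For example, a search that filters on a field distinguishing the two types stays cheap even inside a large index, since Elasticsearch can cache and skip on filter clauses. A minimal sketch (the index pattern, the `@timestamp` field, and the field/value names are illustrative assumptions):

```
GET app-logs-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "LogType": "TrxPersist" } },
        { "range": { "@timestamp": { "gte": "now-15m" } } }
      ]
    }
  }
}
```

Grafana builds essentially this kind of filtered query for you when you set a Lucene query like `LogType:TrxPersist` on the panel's data source.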

Maybe I misunderstood. I can filter on that easily: when LogType = TrxPersist, the log is a transaction, and I can filter on that in Logstash. But I don't know whether it is better to keep these transactions in their own index, or to leave them in the index with the many other logs that I will only search when debugging some problem.
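If you did decide to route the transaction logs to their own index, a conditional output in the Logstash pipeline would be one way to do it. A minimal sketch, assuming the `LogType`/`TrxPersist` field and value from above; the hosts and index names are placeholders, not your actual setup:

```
output {
  if [LogType] == "TrxPersist" {
    elasticsearch {
      hosts => ["localhost:9200"]          # assumed host
      index => "trades-%{+YYYY.MM.dd}"     # assumed daily index for trade logs
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "app-logs-%{+YYYY.MM.dd}"   # assumed index for everything else
    }
  }
}
```

Note this only applies if the events actually pass through Logstash; logs shipped directly by the Logback appender would bypass this routing.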

Elasticsearch should be fine with this.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.