Elasticsearch: Project hierarchy

Hi, I am new to the ELK stack. I am using it to centralize logs for our test runs.
I have a suite of 50 test cases that runs on a nightly basis and generates test- and system-side data. I want to store this data in Elasticsearch, and I am already using Filebeat and Logstash for that. My question is: what would be the right approach here? Do I need to create a separate index for each test and push its data into it, or how should this work?

You don't need a separate index for this. A simple approach would be to index the filename along with each document; then you can simply filter on it in the dashboard.
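Filebeat already attaches the source file path to each event as `log.file.path`, so one option is to derive a per-test field from it in Logstash. Here is a minimal sketch, assuming your logs live under paths like `/var/log/tests/<test_name>/mobile.log` (the path layout and the `test_name`/`log_file` field names are assumptions, adapt them to your setup):

```
filter {
  # Derive a per-test field from the file path shipped by Filebeat.
  # Assumes paths like /var/log/tests/<test_name>/<log_file>
  grok {
    match => { "[log][file][path]" => "/var/log/tests/%{DATA:test_name}/%{GREEDYDATA:log_file}" }
  }
}
```

With something like that in place, a Kibana filter such as `test_name : "login_test"` (a hypothetical test name) narrows a dashboard down to a single test.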

Thanks, @flash1293, for the reply.
Actually, for one test I have 3 different log files: mobile logs, system logs, and some debug logs. So, as you suggested, if I create one index for all 50 tests and push the data into it, the index will contain thousands of documents. How many documents can an index contain, and will it still be easy to query such a big index?
Sorry for the very basic questions.

Thousands of documents are not an issue at all; Elasticsearch can handle many millions of documents in a single index.

A common approach is to split indices by time (e.g. one per month) and delete old ones after a while; this can be automated with index lifecycle management (ILM), as sketched below.
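A minimal ILM policy sketch you could run from Kibana Dev Tools; the policy name `test-logs-policy` and the retention ages are placeholders to adapt:

```
PUT _ilm/policy/test-logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        // roll over to a fresh index every 30 days
        "actions": {
          "rollover": { "max_age": "30d" }
        }
      },
      "delete": {
        // delete indices 90 days after rollover
        "min_age": "90d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

You then attach the policy to your indices via an index template; the details are covered in the ILM documentation.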

In Kibana you can define a single data view that sources documents from multiple indices, using a wildcard index pattern.
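For example, if the time-based indices were named `test-logs-2024.01`, `test-logs-2024.02`, and so on (assumed names), the pattern `test-logs-*` would cover them all. The same wildcard works directly in Dev Tools, here reusing the assumed `test_name` field from earlier:

```
GET test-logs-*/_search
{
  "query": {
    "match": { "test_name": "login_test" }
  }
}
```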

Thank you for the support 🙂
