Time-based Log Data - Index Modeling - Log Data from Multiple Applications

I would appreciate it if someone could help me structure this data properly in ES.

Log data from 10 different applications (App1, App2, ..., App10) is loaded into ES. Currently there is one index per day: Logstash_2018.10.01 holds all log data, for all 10 applications, generated on 2018.10.01. The data streams in with minimal lag.

My data retention window is 7 days. As for the access pattern, the data that came in during the last few minutes/hours is really the hottest. Access to older data, especially anything OTHER than today and yesterday, is rare but possible. In general, today's data is accessed about 80% of the time, yesterday's another 15%, and the remaining 5% of accesses may reach anywhere in the last 7 days.
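To make the retention setup concrete, here is a small sketch of how I think about expiry with one index per day: given today's date, every daily index older than the 7-day window is a candidate for deletion. The function name, the `lookback_days` bound, and the `Logstash_` prefix are just illustrative; the real cleanup would be done by Curator or ILM rather than hand-rolled code.

```python
from datetime import date, timedelta

def expired_indices(today, retention_days=7, lookback_days=30, prefix="Logstash_"):
    """Return names of daily indices that fall outside the retention window.

    Assumes one index per day named like Logstash_2018.10.01 (as in my
    setup). `lookback_days` just bounds how far back the sketch scans.
    """
    return [
        prefix + (today - timedelta(days=d)).strftime("%Y.%m.%d")
        for d in range(retention_days, lookback_days)
    ]
```

For example, with `today = date(2018, 10, 8)` and a 7-day window, the oldest index still kept is Logstash_2018.10.02, and Logstash_2018.10.01 and earlier are listed for deletion.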

Questions against this data are almost always asked in the context of a single app. A typical example: for App1, show me any error or exception that happened today or in the last few minutes/hours - this being a log analytics application.
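The typical query above can be sketched as a Query DSL body: filter on the app, the log level, and a recent time range. The field names (`app`, `level`, `@timestamp`) are assumptions about what my Logstash pipeline emits, not anything prescribed by ES.

```python
def app_errors_query(app, minutes=60):
    """Build the Query DSL body for 'errors for one app in the last N minutes'.

    Field names `app` and `level` are placeholders for whatever the
    Logstash pipeline actually produces.
    """
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"app": app}},
                    {"term": {"level": "ERROR"}},
                    {"range": {"@timestamp": {"gte": "now-%dm" % minutes}}},
                ]
            }
        }
    }
```

This body would be POSTed to `Logstash_*/_search` (or, ideally, to a narrower index pattern or alias so only today's index is hit).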

Kibana (95% of usage) will be the primary query/visualization engine.

I would appreciate some pointers on designing this better than the current one-index-per-day setup encompassing all app data. I have read about several options such as aliases and routing, but I was not sure how to apply them. Being a novice, I will be grateful for any advice.
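From what I have read so far, one direction would be to keep the shared daily index but carve out per-app views with filtered aliases (optionally with matching routing, so each app's queries hit fewer shards). This is only my understanding as a novice; the alias names and the `app` field below are my own placeholders. The resulting body would be POSTed to the `_aliases` endpoint.

```python
def per_app_alias_actions(index, apps):
    """Build an _aliases request body adding one filtered alias per app.

    Each alias restricts the shared daily index to a single application
    and routes that app's documents/queries to one shard. The alias
    naming scheme `<index>-<app>` is illustrative.
    """
    return {
        "actions": [
            {
                "add": {
                    "index": index,
                    "alias": "%s-%s" % (index, app),
                    "filter": {"term": {"app": app}},
                    "routing": app,
                }
            }
            for app in apps
        ]
    }
```

With this in place, a Kibana index pattern per app (pointing at the alias) could serve the "questions are always per-app" access pattern without splitting the ingest into 10 separate daily indices.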

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.