Hello,
I currently use Elasticsearch as the backend for my Graylog server. Everything works fine, but disk space fills up too fast.
I understand it is possible to convert Elasticsearch indices into SQL tables, which seems like a good solution to me because I could then compress that SQL data.
I have not yet found anything to help me with this, which is why I am turning to you.
My Graylog server has a single node. I have configured it to create a new index every day, and indices start to be deleted after 365 days, so one year of logs is kept.
Here is the disk usage of the index directory (ncdu output):

--- /mnt/graylog/elasticsearch/nodes/0/indices ---
/..
133,8 GiB [##########] /qvR8mKiKQRWUHhXWxRAtGQ
29,0 GiB [## ] /pLWbpvYrRWCLmTbuImQ6iQ
28,5 GiB [## ] /YvnIgCv5QPugXtrGillH8w
26,8 GiB [## ] /dYCzPuBaR6qSAF-4PqwHuQ
20,2 GiB [# ] /hXiT4VpxRni39hg2qcN9-g
19,7 GiB [# ] /QpkZaipcTNSbyyjXZk3edg
19,7 GiB [# ] /zMacyH4ARgOYXjbFRgaUsA
19,1 GiB [# ] /Ph7nwRutQOe4e3lrm60K7Q
18,7 GiB [# ] /OzlEyYTgRW-ESstP7uqksA
17,7 GiB [# ] /ROkebCqiS2SKmtnoLd5K-g
16,0 GiB [# ] /P6DUFVqEQo-YseozDhn15A
16,0 GiB [# ] /AdixeDwmTTOs6fDIV8yYyg
15,2 GiB [# ] /ATP7FWk_SpSqchDm2nRIKA
12,9 GiB [ ] /zBebioJGRZyg3aVV2nv-Kw
12,7 GiB [ ] /XhmTQ3n2Roaxe6JCpX6aBQ
6,6 GiB [ ] /xlq-jtI4Twu0kJJsc-L9PA
6,3 GiB [ ] /YchNdqu6S-2oUMygO8EEBQ
3,4 GiB [ ] /YXxNwkPHRmmLz52Ths43zg
3,3 GiB [ ] /gMVH_r3zQKGz5cJT4M1Htw
3,3 MiB [ ] /X_qzLuCyTHSqdhqBeh-0qw
2,3 MiB [ ] /M1YpCI7FQ-eYWh2qHWPJdw
1,2 MiB [ ] /yJbEaRg7Qvu4pxjV3W0OiQ
88,0 KiB [ ] /Ze2VXGWKReq6ReoWw8uzXA
88,0 KiB [ ] /B2yRoFQDRje6sSZrwUZ1yg
84,0 KiB [ ] /WNniZ76vRZm_U0xiny7OUQ
84,0 KiB [ ] /BcNUyhiiS6unxRCK3PShlA
84,0 KiB [ ] /XAzvAgllTnO0GAyMEDX47Q
84,0 KiB [ ] /n47EEchMT4K1qOqysyR_yw
84,0 KiB [ ] /gQbHFxbFQe2bQ_9-CLRudg
36,0 KiB [ ] /mt3t8SLsQ_CNxdCCGgaFBw
24,0 KiB [ ] /JZu86JSRSqe41AjVMOhJcA
24,0 KiB [ ] /lNzkHX6ATle462uQwH3fWg
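For reference, the same sizes can also be listed per index name through the Elasticsearch API rather than per directory. This is only a sketch, assuming Elasticsearch listens on the default localhost:9200 (adjust host/port to your setup):

```shell
# List all indices sorted by on-disk store size, largest first,
# showing index name, document count, and size.
curl -s 'http://localhost:9200/_cat/indices?v&h=index,docs.count,store.size&s=store.size:desc'
```

This makes it easier to match each directory hash above to its Graylog index name.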
Thank you