I have a system that streams, every minute, a series of values about its service to Logstash.
It streams about 50-60 variables per event. A transaction ID is sent along with the variables, so grouping all the variables into a single event in Logstash should not be a problem.
The question is whether doing so is actually useful. Will it save space in Elasticsearch or improve search time?
I'd probably look at the aggregate filter in Logstash.
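To illustrate, a minimal sketch of what that could look like, assuming the shared ID arrives in a field called `transaction_id` (the field name, the 5-second timeout, and the `aggregated` tag are all assumptions for the example, not details from your setup):

```
filter {
  aggregate {
    # group all events that share the same transaction ID
    task_id => "%{transaction_id}"
    # copy each event's fields into the shared map for this task
    code => "event.to_hash.each { |k, v| map[k] ||= v }"
    # when no more events arrive for this ID, emit one merged event
    push_map_as_event_on_timeout => true
    timeout => 5
    timeout_tags => ["aggregated"]
  }
  # drop the partial per-variable events so only the merged one is indexed
  if "aggregated" not in [tags] {
    drop { }
  }
}
```

Note that the aggregate filter requires the pipeline to run with a single worker (`pipeline.workers: 1`) so that related events are processed by the same thread.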
Thanks for the response, but my question is about whether the effort is worthwhile: will it have any positive impact on disk consumption or on Elasticsearch query performance?
Share some examples and you might be able to tell.
I mean that it depends on:
- What you are collecting
- What the users will search for (what kind of document they want)
- Which fields are needed to search for those documents.
App statistics like:
Number of TCP4-Queries, TCP4-Answers, TCP4-Errors.
Number of TCP6-Queries, TCP6-Answers, TCP6-Errors.
CPU time consumed by the app.
Memory consumed by the app.
These are not queried directly by users; they are queried by Kibana to build monitoring dashboards.
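For comparison, here is a sketch of what a single merged document might look like after aggregation. The field names and values are invented for illustration; the point is that one document per minute carries all the metrics, instead of 50-60 separate documents each holding one value:

```json
{
  "transaction_id": "abc-123",
  "@timestamp": "2023-01-01T12:00:00Z",
  "tcp4_queries": 1520,
  "tcp4_answers": 1498,
  "tcp4_errors": 22,
  "tcp6_queries": 310,
  "tcp6_answers": 305,
  "tcp6_errors": 5,
  "cpu_time_ms": 840,
  "memory_bytes": 52428800
}
```

For Kibana dashboards this shape is usually more convenient, since one document holds all the metrics for a given minute, and it avoids indexing the transaction ID and timestamp 50-60 times per interval.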
Don't you have actual JSON documents?
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.