We have built a system that has to keep track of the number of calls a user makes against our API. We also have to keep the logging info to build "audit" logs and give users the ability to search and retrieve their data.
So far, we have used Graylog as the server we stream the data to, via UDP.
Since I want to migrate to ELK & co (APM as well), what's the best/easiest way to do this?
We have a huge amount of data to stream to Logstash every second and save in Elasticsearch (we stream JSON). Should I open a UDP connection here as well (does a library for that exist)? Is there anything provided by Logstash out of the box? I've checked APM, but it seems aimed at evaluating performance and errors rather than collecting logs.
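From what I can tell, Logstash does provide this out of the box via its `udp` input plugin, which can be combined with the `json` codec to parse each datagram directly. A minimal sketch (the port number is an assumption, pick any free UDP port):

```
input {
  udp {
    port  => 5044     # assumed port; must match what the app sends to
    codec => json     # parse each UDP datagram as one JSON event
  }
}
```

With the `json` codec in place, the fields of each JSON document arrive in Elasticsearch as proper event fields, not as one raw string.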
Our setup runs in Kubernetes, with the API code in Docker containers. So streaming log files may not be the easiest option, while streaming directly from the app could be a bit better.
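On the app side, no special library is needed for this: a JSON event over UDP is just one datagram from the standard socket module. A minimal sketch, with a local listener standing in for the Logstash `udp` input (all field names and the address are assumptions for illustration):

```python
import json
import socket

def send_event(sock, addr, event):
    """Serialize an event dict to JSON and send it as a single UDP datagram."""
    sock.sendto(json.dumps(event).encode("utf-8"), addr)

# Demo: a loopback listener standing in for Logstash's udp input.
# In the real setup, addr would be the Logstash service host and port.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # OS picks a free port
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_event(send, addr, {
    "user_id": "u-123",              # hypothetical audit-log fields
    "endpoint": "/v1/items",
    "status": 200,
})

datagram, _ = recv.recvfrom(65535)
event = json.loads(datagram)         # round-trips as structured JSON
```

One caveat: UDP is fire-and-forget, so events can be silently dropped under load; if that matters for audit logs, a TCP input or a shipper like Filebeat would be the safer route.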
PS: is Logstash able to parse the JSON data and maybe route it to different indexes? Since the stored data will be "huge", I would like to create indexes per user (this would be cool for exporting data, but makes it almost impossible to remove data after X days), or split them by document size or by time (this makes it easier to remove indexes after X days, right?). Is this feasible?
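Answering my own PS as far as I understand it: yes, the `elasticsearch` output's `index` option accepts field references and date math, so routing is just string interpolation. A sketch of the time-based variant (index name is an assumption):

```
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    # Daily indices: removing data older than X days means
    # deleting whole indices, which is cheap.
    index => "api-logs-%{+YYYY.MM.dd}"
  }
}
```

A per-user variant like `index => "api-logs-%{[user_id]}"` is also syntactically possible, but with many users the index count explodes, and time-based retention becomes expensive, so time-based indices plus index lifecycle management (or Curator) for deletion after X days is usually the recommended pattern.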