Hey,
I'm having a hard time making the right choice here. I'm not sure if Logstash is the right way to go; hope you guys can help me out a bit.
So, currently all our user-tracking data is sent to Elasticsearch directly, which might not be the best option, as documents get updated multiple times per second in the worst case.
Data comes in and gets processed into a few different formats.
Example 1: User event data
Example 2: Page event data
Would that kind of custom data processing be possible with Logstash? It would need to update the nested array data of specific documents when needed.
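For context, this is roughly the kind of update I mean: appending one event to a nested array via a scripted (Painless) update, with an upsert so the first event creates the document. A minimal sketch of the request body, assuming hypothetical index/field names (`user-events`, `events`, `user_id`):

```python
def build_event_update(user_id, event):
    """Build an Elasticsearch scripted-update request body that appends
    one event to a nested `events` array, creating the document on the
    first event (upsert). Field names here are just placeholders."""
    return {
        "script": {
            # Painless script: push the new event onto the array
            "source": "ctx._source.events.add(params.event)",
            "lang": "painless",
            "params": {"event": event},
        },
        # If the document doesn't exist yet, create it with this content
        "upsert": {"user_id": user_id, "events": [event]},
    }

body = build_event_update("u-123", {"type": "click", "page": "/home"})
# With the official client this would be sent as something like:
# es.update(index="user-events", id="u-123", body=body)
```

Doing this directly from the app is what hammers Elasticsearch with per-second updates; the question is whether Logstash can take over this step.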
This is what I've thought of:
- Send logs to Stackdriver / a Google Cloud Storage bucket
- Process the logs with Logstash into Elasticsearch
or
- Send logs to Stackdriver / a Google Cloud Storage bucket
- Process the logs with Logstash into gzip files and upload them to a Google Cloud Storage bucket
- Add a reference to the gzip file in Elasticsearch
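For what it's worth, the first option looks feasible with stock plugins: the `google_cloud_storage` input can read the bucket, and the `elasticsearch` output supports `action => "update"` with `doc_as_upsert` and a `script` for the nested-array case. A rough pipeline sketch, with bucket name, key path, index, and field names all placeholders:

```
input {
  google_cloud_storage {
    bucket_id     => "my-tracking-logs"      # hypothetical bucket
    json_key_file => "/path/to/key.json"     # hypothetical credentials
  }
}

filter {
  # Parse each log line into fields (assumes JSON-formatted events)
  json { source => "message" }
}

output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "user-events"
    document_id   => "%{user_id}"            # one document per user
    action        => "update"
    doc_as_upsert => true
    # For the nested-array case, a scripted update along these lines:
    # script      => "ctx._source.events.add(params.event)"
  }
}
```

I haven't verified the scripted-update path end to end, so treat this as a sketch rather than a working config.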
Or should I take a different approach?
Help appreciated.