Hello,
Having recently upgraded to 7.2, I am starting to use the ELK stack and trying to use it efficiently.
I didn't realise soon enough that I could apply Grok patterns to the events Filebeat ships to Logstash, which then forwards the parsed documents to Elasticsearch.
So now some logs (~50 GB) are already stored in Elasticsearch, and I need to parse the values of their message field since I didn't do it beforehand.
Is there any way to parse the data that is already indexed, so that I can build visualizations on it efficiently afterwards?
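From what I have read so far, it sounds like an ingest pipeline with a grok processor, replayed over the existing indices with _update_by_query, might be one way to do this. Below is a rough sketch of what I have in mind, using the official Python client; the host, pipeline name, index pattern, and grok pattern are only placeholders, since my real log format is different. Is this the right approach?

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # placeholder host

# 1) Define an ingest pipeline whose grok processor parses the raw "message" field.
#    The pipeline id and the pattern are placeholders; the pattern would have to
#    match my actual log format.
es.ingest.put_pipeline(
    id="reparse-message",
    body={
        "description": "Grok-parse the message field of already-indexed logs",
        "processors": [
            {
                "grok": {
                    "field": "message",
                    "patterns": ["%{COMBINEDAPACHELOG}"],  # placeholder pattern
                    "ignore_failure": True,
                }
            }
        ],
    },
)

# 2) Run the existing documents through that pipeline in place.
#    "filebeat-*" is a placeholder index pattern; wait_for_completion=False makes
#    Elasticsearch return a task id instead of blocking while ~50 GB is reprocessed.
task = es.update_by_query(
    index="filebeat-*",
    pipeline="reparse-message",
    conflicts="proceed",
    wait_for_completion=False,
)
print(task)  # task id to follow up on with the Tasks API
```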
I swear, next time I'll apply the Grok patterns in the Filebeat-to-Logstash pipeline from the start!
Thanks