We built a solution using the ELK stack to parse AWS billing files and generate reports from them. Currently we index every column in the file as-is. However, the set of columns can change over time: columns may be added or dropped depending on how users manage their resource tag names in AWS.
How can we handle this in Elasticsearch, or at the Logstash level, so that the indexing process does not break when the input file's columns change?
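For context, one approach we are considering (not yet tested) is letting the Logstash csv filter derive the column names from the file's header row instead of hard-coding the column list, along these lines (paths and index names below are placeholders, not our real setup):

```
input {
  file {
    path => "/data/aws-billing/*.csv"   # example path, adjust as needed
    start_position => "beginning"
  }
}
filter {
  csv {
    # Use the first row of the file as the column names, so columns
    # that are added or dropped don't require a config change.
    autodetect_column_names => true
    skip_empty_columns => true
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "aws-billing-%{+YYYY.MM}"   # example index pattern
  }
}
```

My understanding is that autodetect_column_names requires a single pipeline worker so the header row is processed first, and that on the Elasticsearch side dynamic mapping would pick up any new fields automatically. Is this the right direction, or is there a better pattern?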
If anybody has come across a similar case and resolved it, I'd appreciate your help.