I have a CSV-based log file produced by mysql_audit.so.
I have been able to filter this into ELK without problems, but there is one type of log entry that contains comma-delimited values within a single column.
An example would be:
<field1>, <field2>, \'<field3, field3.1, field3.2, field3.3, field3.4>\', <field4>
The CSV filter doesn't recognize the escaped quotes, but it does treat the commas inside the escaped field as delimiters.
The number of sub-entries is variable; I have seen anywhere from field3.1 through field3.6.
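To illustrate what I mean, plain Ruby's CSV library parses the line fine once the backslash escapes are stripped and the single quote is treated as the quote character (this is a simplified line without the spaces my real entries have):

```ruby
require 'csv'

# Simplified version of the problem line (spaces omitted); the quoted
# field contains a variable number of comma-separated values.
line = "val1,val2,\\'a,b,c\\',val4"

# Naive comma splitting breaks the quoted field into extra columns.
naive = line.split(",")          # 6 pieces instead of 4

# Strip the backslash escapes, then parse with ' as the quote character
# so the inner commas stay inside one field.
cleaned = line.gsub("\\'", "'")  # => "val1,val2,'a,b,c',val4"
row = CSV.parse_line(cleaned, quote_char: "'")
# row => ["val1", "val2", "a,b,c", "val4"]
```

From the docs I believe the Logstash csv filter exposes a quote_char option as well, but the backslash escaping seems to defeat it in my case.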
I can create a separate sub-filter for this entry type based on fields within the entry, but I currently cannot find a way to join all of the 3.x fields together into a single column.
I have tried using add_field, but any extra columns become a string in the log entry, and I would like to keep the data in its own column.
My filter defines a name for each column, so when it hits this entry type the overflow values land in auto-generated columns (i.e. "column4", "column5", ...).
I suspect that I may have to use the ruby filter to sort this out; the only issue is my ruby-fu is not strong.
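In case it helps frame an answer, this is the sort of logic I imagine the ruby filter would need. It's a rough sketch on a plain hash rather than a real Logstash event; the column names and layout are made up, and in reality the trailing fields would also shift position:

```ruby
# Hypothetical event after the csv filter has overflowed: the quoted
# field was split across field3 plus auto-generated columnN columns.
event = {
  "field1"  => "val1",
  "field2"  => "val2",
  "field3"  => "\\'a",   # first piece of the quoted field
  "column4" => "b",
  "column5" => "c\\'",   # last piece, with the closing escaped quote
  "field4"  => "val4"
}

# Collect the overflow columns in numeric order, whatever their count.
overflow = event.keys.select { |k| k =~ /^column\d+$/ }
                .sort_by { |k| k[/\d+/].to_i }

# Rejoin the pieces into field3, strip the escaped quotes, and drop
# the auto-generated columns.
pieces = [event["field3"], *event.values_at(*overflow)]
event["field3"] = pieces.join(",").gsub("\\'", "")
overflow.each { |k| event.delete(k) }
# event["field3"] => "a,b,c"
```

If that logic is roughly right, I assume it would translate into the code block of a ruby filter operating on event.get/event.set, but that is where my ruby-fu runs out.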
Unfortunately this system is air-gapped from the internet, and I have to manually type any data across.
Just wondering if the brains trust may be able to help out with solving this problem.