I am using Logstash to grab a file from a monitored folder to import my Symantec WSS (Bluecoat) Cloud Proxy logs.
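For reference, the input side is just the file input plugin watching that folder; a minimal sketch of what I have (the path and sincedb location below are placeholders, not my real ones):

input {
  file {
    # Placeholder path to the monitored folder the hourly dumps land in
    path => "/path/to/wss-dumps/*.csv"
    start_position => "beginning"
    # Placeholder sincedb location so re-runs don't re-read old files
    sincedb_path => "/path/to/sincedb_wss"
  }
}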
I have looked at and used the information from the following two old (but still very useful) websites:
Everything works properly and my logs are parsing as they should, but I need the ability to change the positions of some of the columns in order to export the data to my SIEM in a compatible format. If I were using a Bluecoat ProxySG, I could select which headers to include and the order of the columns. With the Cloud Proxy, however, you only get an hourly dump of the logs with the columns laid out in a fixed order, which neither I nor the cloud provider can change.
## Current CSV config to define the column headers
filter {
  csv {
    # One entry per column in the hourly WSS dump, in the order the cloud service writes them
    columns => ["timestamp","user","account","result","source_ip","service","geoip_organization","geoip_country_code","geoip_country_name","geoip_city","geoip_region"]
    separator => ","
  }
}
I need something that can take the above columns and 'cut' them to a different position, e.g. the value in column B (2) is 'cut and pasted' into column A (1), so 'user' would come before 'timestamp'.
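Since the csv filter above already puts each value into its own named field, I assume the original column order stops mattering inside Logstash and the reordering only needs to happen when the data is written back out. Something like the csv output plugin (logstash-output-csv, which I believe may need to be installed separately) with its fields list in the order the SIEM expects is roughly what I am picturing. This is an untested sketch with a placeholder path, and the field order is just the 'user before timestamp' example from above:

output {
  csv {
    # Placeholder output path
    path => "/path/to/output/wss-reordered.csv"
    # Same fields as the csv filter, but written out in the order the SIEM wants
    fields => ["user","timestamp","account","result","source_ip","service","geoip_organization","geoip_country_code","geoip_country_name","geoip_city","geoip_region"]
  }
}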
I am currently using a VERY rudimentary PowerShell script that calls Excel (which you technically should not do) to rearrange the columns via VBA and then re-save the document with the columns in the proper locations. Even with error checking, this script is not reliable, and I would like to use the ELK stack to process these logs as I need them.
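If the csv output plugin is not the right tool for this, would a plain file output with a line codec and a format string be the accepted way to control the column order? Again just an untested sketch with a placeholder path, using the field names from the csv filter above:

output {
  file {
    # Placeholder output path
    path => "/path/to/output/wss-reordered.csv"
    # Build each line by hand in the order the SIEM expects (user first in this example)
    codec => line {
      format => "%{user},%{timestamp},%{account},%{result},%{source_ip},%{service},%{geoip_organization},%{geoip_country_code},%{geoip_country_name},%{geoip_city},%{geoip_region}"
    }
  }
}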
Any help or pointers with this would be greatly appreciated, and if it is not possible with Logstash's CSV plugins, please let me know.
Thank you