Which filter you use depends on the format of the data. In this case the csv filter seems like a natural fit. Sometimes you can also combine multiple filters, e.g. first separate out sections of the log line using grok and then apply other filters to the various parts.
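As a rough sketch of that kind of combination (the grok pattern, field names and columns here are just placeholders for illustration, not taken from your config), you could pull a leading timestamp off the line with grok and then run csv on the remainder:

```
filter {
  grok {
    # Split the line into a timestamp and the csv payload (example pattern only)
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{GREEDYDATA:csv_payload}" }
  }
  csv {
    source    => "csv_payload"
    separator => ","
    columns   => ["user", "action", "duration"]
  }
  date {
    match => ["log_timestamp", "ISO8601"]
  }
}
```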
The csv filter parses everything as strings, so I do not know what you are trying to achieve with your convert statement. I also do not see any field named date log being extracted, so I suspect your date filter may fail.
The csv filter parses all fields as strings, so you will need to convert field types yourself. There is also no special handling of a header line, so what I often do when I have csv files with a header is to drop the event if one of the parsed fields contains the expected header title (assuming I know no data line can have that value).
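Something along these lines, where the column names and types are only an example and would need to match your data:

```
filter {
  csv {
    separator => ","
    columns   => ["timestamp", "user", "duration"]
    # Convert selected columns from string to the desired type
    convert   => { "duration" => "integer" }
  }
  # Drop the header line by matching a known header value in the first column
  if [timestamp] == "timestamp" {
    drop { }
  }
}
```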
You say that it is not working, but not what is wrong. For us to be able to help, please show an example raw event as well as the result of processing it when you send the output to the stdout plugin with a rubydebug codec.
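For reference, a minimal output section for that looks like this:

```
output {
  stdout { codec => rubydebug }
}
```

Paste the resulting event here and it will be much easier to see which fields are actually being extracted.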