Different headers - logstash csv filter


I'm having issues feeding CSV data into Elasticsearch.
My Logstash .conf file monitors a results folder and ingests new .csv data into Elasticsearch with the file input plugin.
The issue is that the .csv file contains different columns from time to time, depending on the test being performed.

The output header can be any combination of columns, depending on the actual results returned by the URL.
Only the first 15 lines of the file are always the same.



[screenshot: SSL variation header]

[screenshot: SSL variation 2 header]

My .conf file is below:

input {
    file {
        path => "/var/lib/pbench-agent/user-benchmark_ose3_test_*/1/taurus.csv"
        start_position => "beginning"
    }
}

filter {
    csv {
        separator => ","
        # column list truncated in this post
        columns => ["avg_ct","avg_lt","avg_rt","concurrency"]
    }

    de_dot {
        # field list truncated in this post
        fields => ["perc_95.0","perc_0.0","perc_99.9","perc_90.0"]
    }
}

output {
    elasticsearch {
        hosts => "gprfc076:9200"
        manage_template => false
        index => "bzt_pbench-%{+YYYY.MM.dd}"
    }
    stdout { codec => rubydebug }
}

How could I create different types and send them to Elasticsearch according to the CSV header line?
Wouldn't this mean changing the doc_type/mapping/.conf file (columns) and restarting Logstash each time?
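One direction I'm considering (not yet tested, and assuming a csv filter version that supports these options) is to stop hard-coding the column list and let the filter pick up the names from the file's own header line via `autodetect_column_names`:

filter {
    csv {
        separator => ","
        # Use the first line the filter sees as the column names
        # instead of a fixed columns => [...] list. This is only
        # reliable with a single pipeline worker (-w 1), since with
        # multiple workers another line may arrive first.
        autodetect_column_names => true
        # Drop subsequent lines that match the detected header,
        # so the header row itself is not indexed as an event.
        skip_header => true
    }
}

That would avoid editing the .conf for every header variation, though it doesn't by itself solve the mapping/doc_type side in Elasticsearch.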

Any suggestions on how to handle this would be greatly appreciated!

Thank you,



Were you able to figure out a way to resolve this issue? I have had the same problem with my indexing. Please share some details if you can; I would appreciate your help.