Convert all fields to integers in the csv filter

Hi,
I'm parsing the CSV output of a command into Logstash, and the output has a dynamic number of columns. All the column values are integers except one, which is a time field. By default Logstash parses them all as strings. How can I convert all the dynamic columns to integers?
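For example, a line of the output looks something like this (the column names here are made up for illustration):

TIME,queries,errors,connections
2023-05-01 12:00:00,120,3,45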
Below is my filter

filter {
  split {}

  csv {
    autodetect_column_names => true
  }

  date {
    match => ["TIME", "yyyy-MM-dd HH:mm:ss"]
  }

  mutate {
    remove_field => ["host"]
  }
}

Change your csv filter to

  csv {
    autodetect_column_names => true
    target => "[@metadata][csvData]"
  }

Fields under [@metadata] are not written to the output, so the parsed columns will only appear in the event once you copy them out. Use a ruby filter to do the copy and the integer conversion. Something like

ruby {
    code => '
        # Copy each parsed column out of @metadata; convert it to an
        # integer unless the field is in the exclusion list
        event.get("[@metadata][csvData]").each { |k, v|
            if [ "TIME" ].include? k
                event.set(k, v)
            else
                event.set(k, v.to_i)
            end
        }
    '
}
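One thing to be aware of: Ruby's String#to_i returns 0 for anything that does not parse as a number, so a non-numeric column that is not in the exclusion list will silently become 0.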

I use an array inclusion test in case you discover there are other fields you do not want to convert.
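Putting it together, the full filter would look something like this (a sketch based on your original pipeline; note that the date filter now has to run after the ruby filter, since TIME only exists as a top-level field once the ruby code copies it out of @metadata):

filter {
  split {}

  # Parse into @metadata so the raw string values never reach the output
  csv {
    autodetect_column_names => true
    target => "[@metadata][csvData]"
  }

  # Copy the columns to the top level, converting to integers
  ruby {
    code => '
      event.get("[@metadata][csvData]").each { |k, v|
        if [ "TIME" ].include? k
          event.set(k, v)
        else
          event.set(k, v.to_i)
        end
      }
    '
  }

  # TIME only exists once the ruby filter has run
  date {
    match => ["TIME", "yyyy-MM-dd HH:mm:ss"]
  }

  mutate {
    remove_field => ["host"]
  }
}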

Thanks, it worked :+1:
