Can't convert types using mutate or ruby filter from fields generated with kv

I have many fields generated from the kv plugin like so:

kv {
  source => "kvpairs"
  remove_field => [ "kvpairs" ]
}
I have tried both of the following, and neither results in the fields being converted to their proper types.

    #iterate through each key and attribute types via convention
    ruby {
      code => "
        event.to_hash.each { |k, v|
          event[k] = v.to_f if k.end_with? '_f'
          event[k] = v.to_i if k.end_with? '_i'
        }
      "
    }
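(Aside: if I recall correctly, the `event[k]` bracket accessor is deprecated on Logstash 5.x in favor of the Event API, which may be part of the problem here. A sketch of the same loop using `event.get`/`event.set`, untested against 5.4:)

    ruby {
      code => "
        event.to_hash.each { |k, v|
          event.set(k, v.to_f) if k.end_with? '_f'
          event.set(k, v.to_i) if k.end_with? '_i'
        }
      "
    }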

After this failed and the new fields were indexed as strings, I tried to convert them manually with the mutate plugin like this:

    #debugging: instead manually convert field type with mutate plugin
    mutate {
      convert => {
        "duration_f" => "float"
        "order_lines_i" => "integer"
      }
    }

After this, I saw that the duration_f and order_lines_i fields were now 'unknown'. After I refreshed the index with Kibana's 'refresh field list' button, I saw that they were both indexed as strings. Any help is appreciated, thank you.

You can't change the mapping of an existing field in ES. You have to reindex.

Which version of Logstash is this?

Sorry, I was mistaken. My teammate actually did re-index between each attempt, not just 'refresh field list'. We are using Logstash 5.4.

It seems very unlikely that ES's automapper would map new fields that have float or integer values as strings. Anyway, you can look into setting up an index template that forces mappings of fields the way you want them. If you're using Hungarian notation ("_f" and "_i" suffixes to denote that type) you should be able to use wildcarding in your template so you won't have to list every single field and what its type should be.
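For example, something along these lines should work (template and mapping names are illustrative; this uses the ES 5.x template syntax with the `template` key, so adjust for your version):

    PUT _template/typed_fields
    {
      "template": "logstash-*",
      "mappings": {
        "_default_": {
          "dynamic_templates": [
            { "floats":   { "match": "*_f", "mapping": { "type": "float" } } },
            { "integers": { "match": "*_i", "mapping": { "type": "integer" } } }
          ]
        }
      }
    }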

Thanks for the advice, I got it sorted out. I have a theory that because the fields are generated with the kv filter, they are indexed as strings regardless of actual value.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.