Logstash Syntax :: Force a Newly-Created Field to be an Integer?

Hi Logstash Gurus,

I need to build off another syntax question I asked, here. In that post, I learned how to use a Ruby Filter to create a new field, then populate that field with the product of other fields:

  ruby {
    code => '
      event.set( "[FieldC]", (event.get("[FieldA]").to_i  *  event.get("[FieldB]").to_i  *  5) )
    '
  }
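To see why the math itself is fine, here is a plain-Ruby sketch of what that code block computes, using hypothetical sample values standing in for what event.get might return:

```ruby
# Plain-Ruby sketch of the filter's computation.
# field_a / field_b are hypothetical sample values, standing in for
# event.get("[FieldA]") and event.get("[FieldB]"), which often arrive as strings.
field_a = "3"
field_b = "4"

field_c = field_a.to_i * field_b.to_i * 5
puts field_c        # 60
puts field_c.class  # Integer
```

So inside Ruby the product is already an Integer; whatever ends up as a string happens after this point.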

This looked correct when I inspected the raw data. But later, I realized that FieldC is a string. This is a problem, as I need it to be an integer.

I first tried the obvious fix, adding a mutate filter after the ruby filter:

  ruby {
    ...same solution from above...
  }
  mutate {
    convert => { "[FieldC]"  => "integer" }
  }

And then this:

  ruby {
    code => '
      event.set( "[FieldC]", (event.get("[FieldA]").to_i  *  event.get("[FieldB]").to_i  *  5).to_i )
    '
  }

(That is, I appended “.to_i” to the end of the whole math expression.)

But neither of these had an effect. (The online documentation also says “Mutating a collection after setting it in the Event has an undefined behaviour and is not allowed,” so I guess this was never going to work.)

A careful read of the Event API page (here) says:

Syntax: event.get(field)

Returns: Value for this field or nil if the field does not exist. Returned values
could be a string, numeric or timestamp scalar value.
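One side note on that "or nil if the field does not exist" part: in Ruby, calling .to_i on nil returns 0, so a missing field makes the filter's math silently produce 0 instead of raising an error. A small sketch:

```ruby
# event.get returns nil for a missing field; in Ruby, nil.to_i is 0,
# so the multiplication quietly yields 0 rather than failing.
missing = nil          # what event.get("[NoSuchField]") would return
puts missing.to_i      # 0
puts missing.to_i * 5  # 0
```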

But I don’t understand how you tell Logstash that FieldC is supposed to be an integer. Is there a way to force this? Thank you!

event.get may return an integer, but event.set will never create one. The mutate+convert you tried is the right way to do it.

Thanks Badger,

So I tried to mutate FieldC into an integer, but the field remained a string. However, when I created a new field from scratch:

  ruby {
    code => '
      event.set( "[FieldC2]", (event.get("[FieldA]").to_i  *  event.get("[FieldB]").to_i  *  5) )
    '
  }
  mutate {
    convert => { "[FieldC2]"  => "integer" }
  }

This worked like a charm. So, all's well that ends well...

Thank you!

PS - Forgot to add, for anyone following this post... I am on Logstash 7.4.0, the Docker Container version

If you are talking about the type in Elasticsearch, then once you index a field on a document as a string, it will be a string for every other document in that index, even if Logstash sends an integer. If you rolled over to a new day's index, it would start getting indexed as an integer even with the old code.
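One quick way to check what Elasticsearch actually mapped the field to is the field-mapping API (the index name below is just illustrative; substitute your real daily index):

```
GET logstash-2019.10.01/_mapping/field/FieldC
```

If the response shows "type": "text" (or "keyword"), the index's dynamic mapping locked the field in as a string on first indexing, regardless of what Logstash sends afterwards.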

Ohhhhh, you're right. I was changing the string to an integer in Logstash, then checking the data type in Kibana. Kibana, of course, was reporting on what it saw in Elasticsearch.

I'll have to research how to change the data type in ES... but for the moment, my workaround is just fine.

Many thanks!