Logstash output using environment variable

Hi,

I'm trying to send my logs to an additional output when a certain field's value is a member of a given list.
The way I do this is by wrapping the additional output in an "if" condition and testing the field against a regex that contains the list, like so:

output {
  ...
  if [_ACCOUNTID] =~ /(valA|valB)/ {
    coralogix {...}
  }
}

When the regex string is hardcoded inside the pipeline code, this works as expected, and logs show up in the output.

But the values change from time to time, so I need to be able to update them (preferably automatically). I want to keep the regex string in some kind of variable, and every time the value changes I'll restart Logstash.
So I tried this:

output {
  ...
  if [_ACCOUNTID] =~ /(${duplicate_logs_accounts})/ {
    coralogix {...}
  }
}

This stopped working: no logs are sent to the output inside the condition.

I'm using the Logstash Docker image 7.16.1 (docker.elastic.co/logstash/logstash:7.16.1). I'm open to upgrading if it might help.

I also looked at the docs, and they suggest doing exactly what I did, so I'm confused.

Also, if there's a better way to load the list (say, from a file that I can mount into the container), I'm really open to that.

Please help!

Environment variables do not work everywhere, as the documentation that you already linked says:

  • You can add environment variable references in any plugin option type: string, number, boolean, array, or hash.

This is not the case for if conditionals, so ${duplicate_logs_accounts} is treated as a literal string and is not expanded to the value of the variable.
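
For contrast, a minimal sketch of where expansion does work: the same variable used inside a plugin option (the [@metadata][accounts] field name here is just illustrative, not something from your config).

mutate {
    # ${duplicate_logs_accounts} IS expanded here, because add_field is a plugin option (a hash of strings)
    add_field => { "[@metadata][accounts]" => "${duplicate_logs_accounts}" }
}

That only moves the value into an event field, though; as far as I know a conditional's =~ pattern cannot come from a field, which is why the approach below is needed.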

I think that one alternative for your use case would be to combine a couple of mutate filters and a translate filter.

It would look something like this:

First, you add a metadata field to be used as the source of the translate filter, so that the lookup always has a match.

mutate {
    add_field => { "[@metadata][force_translate]" => "1" }
}

Then you set up the translate filter, pointing to a dictionary in an external file to make it easier to update.

translate {
    dictionary_path => "/path/to/the/dictionary.yml"
    source => "[@metadata][force_translate]"
    target => "[@metadata][translated]"
    refresh_interval => 60
}

This translate filter looks in the dictionary.yml file for a key equal to the value of the field [@metadata][force_translate], which has been hardcoded to "1". If it is found, the corresponding dictionary value is written to the field [@metadata][translated]. The filter will also re-read the file every 60 seconds to pick up any updates.

The file dictionary.yml should have the following format.

"1": "valA,valB,valC,valN"

After the translate filter runs, you will have the field [@metadata][translated] with the following value.

[@metadata][translated] = "valA,valB,valC,valN"

Now you need to transform this field into an array; to do that, use another mutate filter.

mutate {
    split => { "[@metadata][translated]" => "," }
}

This will give you:

[@metadata][translated] = [ "valA", "valB", "valC", "valN" ]

Now you can change your conditional to:

if [_ACCOUNTID] in [@metadata][translated] {
    actions
}

This will test the value of the field _ACCOUNTID against each value in the array [@metadata][translated].

The @metadata fields only exist inside the pipeline; they will not be exported to the output.
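
Putting the pieces together, a sketch of the full filter and output sections under these assumptions (the coralogix options are elided, as in your original config):

filter {
    # Constant key so the translate lookup always matches
    mutate {
        add_field => { "[@metadata][force_translate]" => "1" }
    }

    # Load the account list from the external dictionary, re-read every 60 seconds
    translate {
        dictionary_path => "/path/to/the/dictionary.yml"
        source => "[@metadata][force_translate]"
        target => "[@metadata][translated]"
        refresh_interval => 60
    }

    # Turn the comma-separated string into an array
    mutate {
        split => { "[@metadata][translated]" => "," }
    }
}

output {
  ...
  if [_ACCOUNTID] in [@metadata][translated] {
    coralogix {...}
  }
}

Since the dictionary is just a file, you can mount it into the Logstash container; thanks to refresh_interval, edits to it are picked up without restarting Logstash.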

