Reading environment variables set with exec plugin

Hello, could you please help me with reading environment variables set with the exec plugin in Logstash? Generally, my input looks like this:

input {
    exec {
        command =>
            "export FILENAME=$(ls -Arst /path/to/file | tail -n 1);
            unzip -p /path/to/file/$(ls -Arst /path/to/file | tail -n 1) folder\\\\file.json;
            zip -d /path/to/file/$(ls -Arst /path/to/file | tail -n 1) folder\\\\file.json > /dev/null 2>/dev/null"
        interval => 60
        codec => json { charset => "UTF-16" }
    }
}

This way I unzip the JSON and print it to the message. Since I can't add anything to the output without breaking the JSON parsing, I would like to save the filename to an environment variable and read it during filtering.

I tried reading the variable a few ways:

mutate { add_field => { "my_field" => "${FILENAME}" } }
or

ruby { code => "str = ENV['FILENAME']; event.set('filename', str)" }

But it returns an error telling me that it couldn't find the environment variable.
Has any of you succeeded in implementing such an operation? I'd be grateful if anyone could help me build such a parser.

I made a few working parsers that add the filename at the beginning of the output, but in that case the JSON codec doesn't work and I need to split the whole message with a regex. It would be more elegant to do this with an environment variable.

No. The exec input creates a new process. Any environment variables set in that process will be available to that process. If you export them they will also be available to its child processes. They will never be available to the parent that did the exec.
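
To illustrate the point with a minimal plain-Ruby sketch (not Logstash-specific; the variable name and value are just examples):

# A child shell can export a variable, but the parent process that spawned it
# never sees any change to its own environment.
system("export FILENAME=example.zip && echo set in child: $FILENAME")
puts ENV["FILENAME"].inspect   # => nil in the parent (Logstash) process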


Thank you for the info. It's a pity that it's impossible.
In my case the only solution was to do it entirely in the filter{} section, like this:

ruby {
    code => "str = `ls -Arst /path/to/file | tail -n 1`.chomp
        event.set('field', str)"
}

So it should work well, provided that none of the files change while the script is executing. If the server is not very busy it may serve well.

Thanks a lot Badger!

Forking a process just to be able to run ls is really expensive. You might want to consider using File::Stat in the ruby filter instead, which would be far cheaper.
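
For example, a rough sketch of what that could look like (the directory path and target field are taken from the examples above and are assumptions; Dir.glob plus File.stat picks the most recently modified file without forking a shell):

ruby {
    code => "
        # Sketch: find the newest file in the directory via File::Stat
        # instead of shelling out to ls. Path and field name are assumptions.
        newest = Dir.glob('/path/to/file/*').max_by { |f| File.stat(f).mtime }
        event.set('field', File.basename(newest)) unless newest.nil?
    "
}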


Thanks again for the suggestion. This is not a big system; I use it to gather forensic artifacts from certain hosts in my network, but if I build something bigger I will definitely test it!
