Getting external values into Logstash; environment filter does not seem to work

I have compressed log files, and am piping them into LS via stdin:
bzip2 -k -d --stdout logfile.bz2 | logstash -f myconfig.conf

This works, but I'd really like each parsed record in Elasticsearch to include the name of the file it came from. There seems to be no way to do that using command-line arguments to LS; there's no -D"fieldname=fieldvalue" option or similar.

There is a contributed 'environment' filter which supposedly lets me set fields from Unix environment variables, but I can't get it to work: it doesn't pick up the environment variable, and LS complains about the filter not having a version number and not being supported. An examination of the source code suggests it's unfinished.
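
For the record, here's roughly what I tried (a sketch, assuming the filter's add_field_from_env option behaves as its documentation suggests; SOURCEFILE is just the variable name I chose):

filter {
  environment {
    # intended to copy the SOURCEFILE environment variable
    # into a 'source_file' field on each event
    add_field_from_env => { "source_file" => "SOURCEFILE" }
  }
}

invoked as:

SOURCEFILE=logfile.bz2 bzip2 -k -d --stdout logfile.bz2 | logstash -f myconfig.conf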

The only working solution I have right now is to write a template version of my conf file and use sed to generate a modified copy for each input file, but this is a very ugly solution.
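
Roughly, the workaround looks like this (the @SOURCEFILE@ placeholder name is mine):

# template.conf.in contains @SOURCEFILE@ wherever the file name is needed
sed "s|@SOURCEFILE@|$(pwd)/logfile.bz2|" template.conf.in > generated.conf
bzip2 -k -d --stdout logfile.bz2 | logstash -f generated.conf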

Anyone have better suggestions?
thanks!
Peter

You can specify the Logstash configuration both through a file with -f and on the command line with -e (untested):

bzip2 -k -d --stdout logfile.bz2 | \
  logstash \
    -e "filter { mutate { add_field { 'path' => '$(pwd)/logfile.bz2' } } }' \
    -f myconfig.conf

Perhaps not beautiful, but you can at least use the same configuration file for all input files.
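
If you have many files, you could wrap this in a shell loop (also untested; note that it starts a fresh Logstash process per file, which is slow but keeps each file's name attached to its events):

for f in *.bz2; do
  bzip2 -k -d --stdout "$f" | \
    logstash \
      -e "filter { mutate { add_field => { 'path' => '$(pwd)/$f' } } }" \
      -f myconfig.conf
done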

The -e option is applied first and the configuration files are read afterwards, which is significant since you may want to have your manually added field available to other filters:
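
For example, a filter in myconfig.conf could already reference the field (hypothetical excerpt; the tag name is made up):

filter {
  # 'path' was added by the -e snippet above, so it is
  # available here for conditionals and further mutation
  if [path] =~ /logfile/ {
    mutate { add_tag => ["from_bzip2_file"] }
  }
}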