Making Logstash configurations dynamic

Is there a way to make a Logstash configuration file dynamic by reading attribute names and types from a separate text file (or some other input mechanism)?

To explain, if my logstash configuration file looks something like this:

if [@metadata][f] == "SCALE" {
  csv {
    separator => ","
    columns => ["TIMESTAMP", "Machine_ID", "FLAG", "Batch_ID", "Scale", "Op_Name", "Set_Point", "Avg_Wght", "PPM", "T1PPM"]
  }
  mutate { add_field => { "[@metadata][target_index]" => "dev-mw-1" } }
}

The column names in this configuration are hard coded, and each field is converted to its respective type using a mutate filter.

Can I take these column names and their data types from some other input source (e.g. a text file), store them in a variable, and use that variable, so that whenever Logstash runs it picks up the fresh attributes and adjusts the configuration accordingly?

A Logstash configuration can reference environment variables. However, substitution does not work everywhere. Suppose we have

COL1=foo
COL2=bar
COL1TYPE=integer
COL2TYPE=integer
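These variables have to be present in the environment of the Logstash process. One way to do that is to export them in the shell that starts Logstash; a minimal sketch (the values and the config path are illustrative):

```shell
# Export the column names and types so the Logstash process can see them
export COL1=foo COL2=bar COL1TYPE=integer COL2TYPE=integer

# Then start Logstash with the pipeline that references ${COL1} etc.
# bin/logstash -f dynamic.conf
```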

Then the configuration

input { generator { count => 1 lines => [ '123,456' ] } }
filter {
    csv { columns => [ "${COL1}", "${COL2}" ] }
    mutate { convert => { "${COL1}" => "${COL1TYPE}" } }
    mutate { convert => { "bar" => "${COL2TYPE}" } }
}
output  { stdout { codec => rubydebug { metadata => false } } }

will result in

       "bar" => 456,
       "foo" => "123",

Note that bar is an integer but foo is not: the left-hand side of the convert option is not substituted. Also, a string that looks like an array is interpreted as a string, not an array. So you cannot do

COLUMNS=foo, bar

and then use it like this

csv { columns => [ ${COLUMNS} ] }
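Since a variable cannot expand into an array, one workaround (not part of the answer above; the file names and the name:type spec format are assumptions) is to generate the filter file itself from a column spec before starting Logstash:

```shell
# Sketch: build a csv + mutate filter block from a spec file whose lines
# look like "name:type" (file names here are hypothetical).
printf 'foo:integer\nbar:integer\n' > columns.txt

# Quoted, comma-separated column names: "foo","bar"
cols=$(cut -d: -f1 columns.txt | sed 's/.*/"&"/' | paste -s -d, -)

# convert entries: "foo" => "integer" "bar" => "integer"
convs=$(awk -F: '{printf "\"%s\" => \"%s\" ", $1, $2}' columns.txt)

# Emit the generated filter; include this file in the pipeline config.
cat > filter.conf <<EOF
filter {
  csv { columns => [ $cols ] }
  mutate { convert => { $convs } }
}
EOF
```

Regenerating filter.conf each time before launch gives the effect of "fresh attributes on every run", at the cost of an extra templating step outside Logstash.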