Logstash Conditionals Efficiency/Best Practice?


#1

Hi, I am interested in using Logstash conditionals to look up the server hostname in an array and determine the rest of the pipeline, giving me a single pipeline/shipper config that covers all servers/environments.

A typical system will have between about 10-40 servers covering all qa/dev/production and the different geographical regions.

Does anyone have a feel for the efficiency of this method? Obviously every log event will involve a lookup on the server hostname (out of 10-40) to determine which log file to input, which Elasticsearch cluster to use in the output, and a few other fields.

The pros: I would only need to manage and deploy a single config across each system. The cons: it is more complex to set up initially, but what about performance? Is Logstash robust enough to handle it? Has anyone gone this route before? Is this a valid approach or a big no-no?

Where applicable I have gone with tags, as I understand this is more efficient than creating new fields, but the output section could get a bit convoluted as there could be six different possible Elasticsearch clusters!

filter {
    if [host] in ["ServerA", "ServerB", "ServerC"] {
        mutate {
            add_field => { "env" => "QA" }
            add_tag   => [ "AMERS" ]
        }
    } else if [host] in ["ServerD", "ServerE"] {
        mutate {
            add_field => { "env" => "PROD" }
            add_tag   => [ "EMEA" ]
        }
    }
}
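An alternative worth considering for the host lookup is the translate filter, which replaces a chain of conditionals with a single dictionary (hash) lookup, so the cost stays constant no matter how many hosts are added. This is only a sketch: it assumes the logstash-filter-translate plugin is installed, and the option names (`source`/`target` in recent plugin versions, `field`/`destination` in older ones) should be checked against the installed version.

```
filter {
    # Map each hostname to its environment in one hash lookup
    # instead of scanning a chain of "if [host] in [...]" tests.
    translate {
        source     => "[host]"
        target     => "[env]"
        dictionary => {
            "ServerA" => "QA"
            "ServerD" => "PROD"
        }
        fallback => "UNKNOWN"
    }
}
```

A second translate block (or a dictionary of combined values like "QA-AMERS") could assign the region the same way.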

Output

output {
    if [env] == "QA" and "AMERS" in [tags] {
        elasticsearch {
            hosts => "elasticA:9200"
        }
    } else if [env] == "QA" and "EMEA" in [tags] {
        elasticsearch {
            hosts => "elasticB:9200"
        }
    } else if [env] == "PROD" and "EMEA" in [tags] {
        elasticsearch {
            hosts => "elasticC:9200"
        }
    } else if [env] == "PROD" and "AMERS" in [tags] {
        elasticsearch {
            hosts => "elasticD:9200"
        }
    }
}
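If the goal is a single config file deployed everywhere, another option is Logstash's environment-variable substitution (`${VAR:default}`) in the config, which collapses the output section to one block. A sketch, assuming the variable name `ES_HOST` is set per server or region at deploy time (the name is illustrative):

```
output {
    elasticsearch {
        # ES_HOST is exported per server/region at deploy time;
        # falls back to localhost:9200 if unset.
        hosts => "${ES_HOST:localhost:9200}"
    }
}
```

This moves the per-environment decision out of the pipeline entirely, so no conditional is evaluated per event in the output.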

(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.