Issue with Filebeat and Multiple Logstash Conf files

I have two Logstash .conf files under a Configurations folder, and Logstash is run with the command logstash -f \Configurations. The two conf files are IIS.conf and Services.conf.
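
For context, the folder contains just those two files and Logstash is pointed at the whole folder:

\Configurations\IIS.conf
\Configurations\Services.conf

logstash -f \Configurations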

Everything worked smoothly shipping from Logstash to Elasticsearch.

Now I have Filebeat sending data from multiple paths to Logstash, with no change to the Logstash command line.

So, when I trigger my services, I can see logs being captured for IIS only and not for my services.

Here is my Filebeat configuration -

- input_type: log
  paths:
    - D:\ServiceLogs\Zephyr\*\*
    - D:\Zephyr\inetpub\LogFiles\*\*
  tags: ["Zephyr"]
  ignore_older: 1h
- input_type: log
  paths:
    - D:\ServiceLogs\Nightingale\*\*
    - D:\Nightingale\inetpub\LogFiles\*\*
  tags: ["IVR"]
  ignore_older: 1h
- input_type: log
  paths:
    - D:\FCServices\inetpub\LogFiles\*\*
  tags: ["FCServices"]
  ignore_older: 1h
- input_type: log
  paths:
    - D:\BreezeServices\inetpub\LogFiles\*\*
  tags: ["BreezeServices"]
  ignore_older: 1h
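
For completeness, those prospectors sit under the filebeat.prospectors key in filebeat.yml, and the output section points Filebeat at the Logstash beats port. A trimmed sketch (the hostname below is a placeholder, not my real server):

filebeat.prospectors:
- input_type: log
  paths:
    - D:\ServiceLogs\Zephyr\*\*
  # ... the remaining prospectors as shown above ...

output.logstash:
  hosts: ["logstash-server:5044"]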

I think it is only reading IIS.conf and not Services.conf.

Please help.

Please post your two Logstash configs too. Are there any errors or warnings in the Filebeat logs?

There are no errors or warnings when I debug. I have tried Filebeat with each individual Logstash conf file and both work fine.
I can share the Logstash files in some time.
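
For reference, this is roughly how I run Filebeat when debugging (the flags may differ slightly by version, and filebeat.yml is the default config name):

filebeat.exe -e -c filebeat.yml -d "*"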

Logstash config 1 (Services.conf) -

input
{
	beats 
	{
		port => 5044
	}
}

filter
{
	kv 
	{
		value_split => ":"
		remove_char_key => "\[\]"
		remove_char_value => "\[\]"
		include_keys => [ "method", "reasonPhrase", "requestUri", "content", "Payload", "id", "ClientID" ]
		recursive => "true"
    }
	
	if "Zephyr" in [tags]
	{
		mutate
		{
			replace => { "type" => "Zephyr" }
		}
    }
	
    if "IVR" in [tags]
	{
		mutate
		{
			replace => { "type" => "IVR" }
		}
    }
	
	mutate
	{
		add_field => { "LogType" => "Services" }
		remove_field => [ "tags", "offset", "input_type", "beat" ]
	}
}

output 
{
	if [type] == "Zephyr"
	{
		elasticsearch 
		{
			index => "zephyr-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	
    if [type] == "IVR" 
	{
		elasticsearch 
		{
			index => "ivr-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	#stdout
	#{ 
	#	codec => rubydebug 
	#}
}

Logstash config 2 (IIS.conf) -

input
{
	beats 
	{
		port => 5044
	}
}

filter
{
	#Ignore log comments
	if [message] =~ "^#" 
	{
		drop {}
	}
	
	grok 
	{ 
		match => 
		{
			"message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:HttpVerb} %{URIPATH:RequestUri} %{NOTSPACE:querystring} %{NUMBER:Port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:username} %{NUMBER:ResponseCode} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"
		}
	}
	
	#Set the Event Timestamp from the log
	date 
	{
		match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
		timezone => "Etc/UTC"
	}
	
	#Ignore all values for which GROK pattern is not set at this moment.
	if "_grokparsefailure" in [tags]
	{
		drop {}
	}
	
	if "Zephyr" in [tags]
	{
		mutate
		{
			replace => { "type" => "Zephyr" }
		}
    }
	
    if "IVR" in [tags]
	{
		mutate
		{
			replace => { "type" => "IVR" }
		}
    }
	
	if "FCServices" in [tags]
	{
		mutate
		{
			replace => { "type" => "FCServices" }
		}
    }
	
    if "BreezeServices" in [tags]
	{
		mutate
		{
			replace => { "type" => "BreezeServices" }
		}
    }
	
	mutate
	{
		add_field => { "LogType" => "IIS" }
		remove_field => [ "@version", "log_timestamp", "site", "querystring", "username", "clienthost", "useragent", "subresponse", "scstatus", "tags", "offset", "input_type", "beat" ]
	}
}

output 
{
	if [type] == "Zephyr"
	{
		elasticsearch 
		{
			index => "zephyr-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	
    if [type] == "IVR"
	{
		elasticsearch 
		{
			index => "ivr-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	
	if [type] == "FCServices"
	{
		elasticsearch 
		{
			index => "fcservices-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	
	if [type] == "BreezeServices"
	{
		elasticsearch 
		{
			index => "breezeservices-%{+YYYY.MM.dd}"
			hosts => ["server:9200"]
		}
    }
	#stdout
	#{ 
	#	codec => rubydebug 
	#}
}
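
To show what that grok pattern expects, here is a purely made-up IIS W3C line in the same field order (not taken from my real logs):

2018-01-15 10:23:45 10.0.0.5 GET /api/orders - 443 - 10.0.0.9 Mozilla/5.0+(Windows+NT) - 200 0 0 123

The # comment lines that IIS writes at the top of each log file are removed by the first conditional, and anything that does not match the pattern is removed by the _grokparsefailure check.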

I tried removing one of the conf files from the Configurations folder, and then the logs started ingesting. That confirms that only one conf file is effectively being read. Any idea how I can get both conf files working at the same time, or whether I can make a single prospector work with a specific conf file?

Hello andrewkroh, any update on this?

For now I have merged both conf files into one and things are working (a trimmed sketch of the structure is below), but I will need to split them out again as things grow over time.
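
The merged file is essentially the two sets of filters behind a single beats input. In the sketch, the routing on the source field (the log file path Filebeat adds to each event) and the "inetpub" check are illustrative rather than my exact condition:

input {
	beats {
		port => 5044
	}
}

filter {
	# "source" holds the file path added by Filebeat; inetpub paths are IIS access logs.
	if [source] =~ "inetpub" {
		# IIS branch: comment drop, grok and date from the IIS config above (extra field cleanup omitted here).
		if [message] =~ "^#" {
			drop {}
		}
		grok {
			match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:HttpVerb} %{URIPATH:RequestUri} %{NOTSPACE:querystring} %{NUMBER:Port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:username} %{NUMBER:ResponseCode} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}" }
		}
		date {
			match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
			timezone => "Etc/UTC"
		}
		if "_grokparsefailure" in [tags] {
			drop {}
		}
		mutate { add_field => { "LogType" => "IIS" } }
	} else {
		# Services branch: the kv settings from the Services config above.
		kv {
			value_split => ":"
			remove_char_key => "\[\]"
			remove_char_value => "\[\]"
			include_keys => [ "method", "reasonPhrase", "requestUri", "content", "Payload", "id", "ClientID" ]
			recursive => "true"
		}
		mutate { add_field => { "LogType" => "Services" } }
	}

	# Tag-to-type mapping, shared by both branches.
	if "Zephyr" in [tags] { mutate { replace => { "type" => "Zephyr" } } }
	if "IVR" in [tags] { mutate { replace => { "type" => "IVR" } } }
	if "FCServices" in [tags] { mutate { replace => { "type" => "FCServices" } } }
	if "BreezeServices" in [tags] { mutate { replace => { "type" => "BreezeServices" } } }
	mutate { remove_field => [ "tags", "offset", "input_type", "beat" ] }
}

output {
	# One conditional elasticsearch output per type, exactly as in the two original files.
	if [type] == "Zephyr" { elasticsearch { index => "zephyr-%{+YYYY.MM.dd}" hosts => ["server:9200"] } }
	if [type] == "IVR" { elasticsearch { index => "ivr-%{+YYYY.MM.dd}" hosts => ["server:9200"] } }
	if [type] == "FCServices" { elasticsearch { index => "fcservices-%{+YYYY.MM.dd}" hosts => ["server:9200"] } }
	if [type] == "BreezeServices" { elasticsearch { index => "breezeservices-%{+YYYY.MM.dd}" hosts => ["server:9200"] } }
}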

To be clear, the two configuration files are for Logstash, right? It might be better to move this question to the Logstash section, where you'll get the attention of the Logstash experts. Or is Filebeat doing anything wrong here?

Thanks for the reply. I will try to post this in the Logstash section as well.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.