How to configure different indexes in Logstash

Hi,

I'm trying to ingest multiple (two right now) differently formatted logs via Filebeat into Logstash and have them sent to different indexes based on some criteria. My first thought was to use the source (path) to identify which log is being ingested and then use that to determine which index the logs should be indexed into.

One of the sets of logs is a custom app log and the other is IIS logs. The intent is to ingest the IIS logs into an index where I can use the Filebeat IIS module and its Kibana dashboards.

Here is my logstash.config ...

input {
    beats {
		port => 5044
    }
}
  
filter {
    # Strip the leading prefix (up to 37 characters) from the message, then
    # parse the remainder as JSON.
    mutate {
        gsub => ["message", "^.{1,37}(.*)$", "\1"]
    }
    json {
        source => "message"
    }

    # Capture the IIS log file name into iisLogSource when the path contains \W3SVC1\.
    grok {
        match => { "source" => "%{GREEDYDATA}\\W3SVC1\\%{DATA:iisLogSource}.log" }
    }

    # Pull the ISO8601 timestamp out of the Timestamp field and use it for @timestamp.
    grok {
        match => { "Timestamp" => "%{TIMESTAMP_ISO8601:logdate}" }
    }
    date {
        match => ["logdate", "ISO8601"]
    }

    # When the event has a TimeSpan field, convert hh:mm:ss.fraction into elapsed seconds.
    if ("" in [TimeSpan]) {
        grok {
            match => { "TimeSpan" => "%{INT:hours}:%{INT:minutes}:%{INT:seconds}.%{INT:subsecond}" }
        }
        ruby {
            code => '
                subsecond = event.get("subsecond")
                if subsecond
                    subsecond = subsecond.to_f / (10 ** subsecond.length)
                    event.set("elapsed", 3600 * event.get("hours").to_f + 60 * event.get("minutes").to_f + event.get("seconds").to_f + subsecond)
                end
            '
            remove_field => ["hours", "minutes", "seconds", "subsecond"]
        }
    }
}
  
output {
    if [iisLogSource] != "" {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "filebeat-iis-%{+YYYY.MM.dd}"
        }
    } else {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "dclogstash-%{+YYYY.MM.dd}"
        }
    }
}

I was trying to use iisLogSource as a generic variable allowing me to determine if the log came from a particular path (C:\SomePath\W3SVC1\filename.log).
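
Side question: since iisLogSource is only added when that grok actually matches, would it be better to just test for the field's presence in the output instead of comparing it against an empty string? Something like this (untested):

output {
    # iisLogSource only exists on events where the W3SVC1 grok matched
    if [iisLogSource] {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "filebeat-iis-%{+YYYY.MM.dd}"
        }
    } else {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "dclogstash-%{+YYYY.MM.dd}"
        }
    }
}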

Thanks for your help in advance.

If the 2 different sources go to 2 different indexes, you can use tags on the Filebeat side.
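
Something along these lines, for example (just a sketch; the paths and tag names are placeholders, adjust them to your setup):

filebeat.yml:

filebeat.inputs:
- type: log
  paths:
    - C:\Path\To\AppLogs\*.json
  tags: ["applog"]

- type: log
  paths:
    - C:\Path\To\IISLogs\*.log
  tags: ["iis"]

logstash.conf output:

output {
    if "iis" in [tags] {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "filebeat-iis-%{+YYYY.MM.dd}"
        }
    } else {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "dclogstash-%{+YYYY.MM.dd}"
        }
    }
}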

Thanks @aaron-nimocks.

I was able to find a solution that worked for me, and I believe it was basically what you were referring to.

First, I added a field identifying the log type to each input in my filebeat.yml ...

filebeat.inputs:

- type: log
  enabled: true
  paths: 
    - C:\PerfElastic\Logs\*.json
  fields: 
    log_type: diagnostics    

- type: log
  enabled: true
  paths: 
    - C:\PerfElastic\Logs\testiis\*.log
  fields: 
    log_type: iis  

... and then in the logstash.conf, checked for the field in my output ...

output {
    if ([fields][log_type] == "iis") {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "filebeat-iis-%{+YYYY.MM.dd}"
        }
    } else {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "dclogstash-%{+YYYY.MM.dd}"
        }
    }
}
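
One thing to be aware of: because I did not set fields_under_root in filebeat.yml, Filebeat nests the custom field under fields, which is why the conditional checks [fields][log_type] rather than [log_type]. I believe that if you add fields_under_root: true to an input, e.g.

- type: log
  enabled: true
  paths:
    - C:\PerfElastic\Logs\testiis\*.log
  fields:
    log_type: iis
  fields_under_root: true

the field would end up at the top level of the event and the output check would then just be if [log_type] == "iis". I stuck with the default.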

Additional notes...

  • This post (link) was helpful
  • YAMLLINT was very helpful for checking that your syntax is correctly formatted

Hopefully this will help someone in the future.

@pheathers yes, that's what I was suggesting and glad you got it all worked out! :slight_smile:

