Setting up Filebeat with the IIS module to parse IIS logs

Hi.

I'm trying to set up the Filebeat IIS module (link) so I can display IIS logs (version 10) in the canned Kibana dashboards, but I get errors in Logstash when parsing the messages, which prevents them from displaying correctly.

[2020-09-17T08:26:31,673][WARN ][logstash.filters.json    ] Error parsing json {:source=>"message", :raw=>" /AppServer/Service.asmx - 443 - 172.16.10.37 Mozilla/4.0+(compatible;+MSIE+6.0;+MS+Web+Services+Client+Protocol+4.0.30319.42000) - 200 0 0 11", :exception=>#<LogStash::Json::ParserError: Unexpected character ('/' (code 47)): maybe a (non-standard) comment? (not recognized as one since Feature 'ALLOW_COMMENTS' not enabled for parser)

I tried to search for a feature called ALLOW_COMMENTS, but didn't find anything helpful.

Below are my configuration files ...

Logstash.config

input {
    beats {
		port => 5044
    }
}
  
filter {
    mutate {
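        # drop the first (up to) 37 characters of the message, i.e. the leading prefix, before any parsing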
        gsub => ["message", "^.{1,37}(.*)$","\1"]
    }
    json {
        source => "message"      
    }   
	
	if ([fields][log_type] == "diagnostics") {
		grok { 
			match => { "Timestamp" => "%{TIMESTAMP_ISO8601:logdate}" } 
		}
		
		date {
			match => ["logdate", "ISO8601"]
		}	
		if ("" in [TimeSpan]) {
			grok {
				match => { "TimeSpan" => "%{INT:hours}:%{INT:minutes}:%{INT:seconds}.%{INT:subsecond}" }
				
			}
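			# convert the hours/minutes/seconds/subsecond captures into a single elapsed-seconds field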
			ruby {
				code => '
					subsecond = event.get("subsecond")
					if subsecond
						subsecond = subsecond.to_f / (10 ** subsecond.length)
						event.set("elapsed", 3600 * event.get("hours").to_f + 60 * event.get("minutes").to_f + event.get("seconds").to_f + subsecond)
					end
				'
				remove_field => ["hours", "minutes", "seconds", "subsecond"]
			}		
		}	
	}
	
	if ([fields][log_type] == "iis") {
	}
	
}
  
output {
	if ([fields][log_type] == "iis"){
		if [@metadata][pipeline] {
			elasticsearch {
				hosts => ["localhost:9200"]
				index => "filebeat-iis-%{+YYYY.MM.dd}"
				pipeline => "%{[@metadata][pipeline]}"
			}	
		}		
	} else {
		elasticsearch {
			hosts => ["localhost:9200"]
			index => "dclogstash-%{+YYYY.MM.dd}"
		}
	}
}
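One note on the pipeline => "%{[@metadata][pipeline]}" line: as far as I understand, it only does something if the IIS module's ingest pipeline already exists in Elasticsearch. Because Filebeat is sending to Logstash here rather than directly to Elasticsearch, the pipeline has to be loaded once manually, with something along the lines of (hosts here are just my local setup, adjust as needed):

filebeat setup --pipelines --modules iis -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["localhost:9200"]'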

Filebeat.yml

filebeat.inputs:

- type: log
  enabled: true
  paths: 
    - C:\PerfElastic\Logs\*.json
  fields: 
    log_type: diagnostics    

#- type: log
#  enabled: true
#  paths: 
#    - C:\PerfElastic\Logs\IIS\IIS LogFiles - node *\LogFiles - node *\W3SVC1\*.log
#  fields: 
#    log_type: iis  

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

setup.kibana:
  host: "http://tsv-006394:5601"

output.logstash:
  hosts: ["localhost:5044"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - decode_json_fields:
     fields: ['message']
     target: json
     overwrite_keys: true
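Note: since fields_under_root defaults to false, the custom log_type value shows up under fields.log_type on each event, which is why the Logstash config above tests [fields][log_type].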

iis.yml

- module: iis
  # Access logs
  access:
    enabled: true
    var.paths: ["C:/PerfElastic/Logs/IIS/*/*/*/*.log"]

  # Error logs
  error:
    enabled: false

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

Any help is appreciated on this. Thanks!

I finally figured out my issue, which was likely caused by multiple misconfigurations.

The first was in my Logstash.config: I was processing all messages as JSON. Since IIS messages are not in JSON format, the json filter failed when reading them. I moved the json filter inside the if clause so that only my diagnostics messages are parsed as JSON.
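For reference, IIS access logs are plain space-delimited W3C text with # comment lines (#Software, #Fields, and so on) at the top of each file, so they can never be parsed as JSON; the :raw value in the original error is just such a line, presumably with its leading date/time fields already removed by the gsub.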

filter {
    mutate {
        gsub => ["message", "^.{1,37}(.*)$","\1"]
    }
  
	if ([fields][log_type] == "diagnostics") {	
		json {
			source => "message"      
		} 	
	
		grok { 
			match => { "Timestamp" => "%{TIMESTAMP_ISO8601:logdate}" } 
		}
		
		date {
			match => ["logdate", "ISO8601"]
		}	
		if ("" in [TimeSpan]) {
			grok {
				match => { "TimeSpan" => "%{INT:hours}:%{INT:minutes}:%{INT:seconds}.%{INT:subsecond}" }
				
			}
			ruby {
				code => '
					subsecond = event.get("subsecond")
					if subsecond
						subsecond = subsecond.to_f / (10 ** subsecond.length)
						event.set("elapsed", 3600 * event.get("hours").to_f + 60 * event.get("minutes").to_f + event.get("seconds").to_f + subsecond)
					end
				'
				remove_field => ["hours", "minutes", "seconds", "subsecond"]
			}		
		}	
	}	
}

Next, while there might be other ways around it, I updated my Filebeat.yml and set the output to Elasticsearch. This let Filebeat send my IIS logs directly to Elasticsearch instead of going through Logstash first.

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  protocol: "http"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  # username: "elastic"
  # password: "changeme"

#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
#  hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
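If I understand the docs correctly, with output.elasticsearch Filebeat applies the IIS module's ingest pipeline itself, so no Logstash parsing is needed, and the canned Kibana dashboards get loaded by running setup against the setup.kibana host configured earlier, e.g.:

filebeat setup --dashboards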

I'm not sure whether there is a way to send some messages to Logstash and others directly to Elasticsearch from a single Filebeat process, but this at least got my data into Elasticsearch so I can move on.
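(From what I can tell, Filebeat only supports a single active output, so the other option would probably be to keep sending everything through Logstash and let the original Logstash output forward the IIS events to the module's ingest pipeline.)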
