Unable to parse IIS logs

Hi there

I am having some difficulty parsing IIS logs into the discrete fields.

The following tags keep showing up:

"tags": [
  "beats_input_codec_plain_applied",
  "_grokparsefailure",
  "_geoip_lookup_failure"

If I test my filter using the stdin input and the stdout output with the rubydebug codec, everything works as intended:

{
        "cs_cookie" => "_ga=GA1.2.1673373544.1536632622;+selectionCriteria.BaseApp.GlobalSelectionCriteria=%7B%22lastUserLanguageCookie%22%3A%22en_gb%22%7D;+_gid=GA1.2.1287340382.1542184575;+token=8f15da6d-818e-422e-bb3d-d1db4a6c4ae0",
       "s-sitename" => "W3SVC1",
       "cs_referer" => "https://awesomesite.com/",
        "cs_status" => "200",
      "cs_username" => "-",
        "cs_method" => "PUT",
 "X-Forwarded-Port" => "443",
         "@version" => "1",
      "cs-uri-stem" => "/somestem",
     "cs_useragent" => "Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/70.0.3538.102+Safari/537.36",
          "cs_host" => "awesomesite",
         "sc_bytes" => "1790",
         "cs_bytes" => "928",
       "@timestamp" => 2018-11-15T23:59:55.000Z,
       "time_taken" => "46",
"X-Forwarded-Proto" => "https",
            "geoip" => {
       "postal_code" => "1506",
       "region_name" => "Gauteng",
      "country_name" => "South Africa",
          "timezone" => "Africa/Johannesburg",
          "location" => {
        "lon" => 28.3167,
        "lat" => -26.1833
    },
                "ip" => "externalipwashere",
         "longitude" => 28.3167,
    "continent_code" => "AF",
         "city_name" => "Benoni",
          "latitude" => -26.1833,
     "country_code3" => "ZA",
       "region_code" => "GT",
     "country_code2" => "ZA"
},
           "s_port" => "80",
     "cs_substatus" => "0",
       "cs_version" => "HTTP/1.1",
             "host" => "somehost",
     "cs_uri_query" => "-",
           "X-auth" => "-",
  "sc_win32_status" => "0",
             "s_ip" => "ipwashere",
   "s-computername" => "somehostname"
}
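For reference, the test harness is just a minimal sketch along these lines, with the same filter block as in the config further down:

input { stdin {} }

# filter { ... }  -- identical grok/date/geoip filter as in the config below

output {
  stdout { codec => rubydebug }
}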

Setup:

Filebeat 6.5
Logstash 6.4
Amazon_ES 6.3

Logstash pipeline config (beats input):

input {
  beats {
    port => 5044
  }
}

filter {
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:s-sitename} %{NOTSPACE:s-computername} %{IPORHOST:s_ip} %{WORD:cs_method} %{URIPATH:cs-uri-stem} %{NOTSPACE:cs_uri_query} %{NUMBER:s_port} %{NOTSPACE:cs_username} %{NOTSPACE:cs_version} %{NOTSPACE:cs_useragent} %{NOTSPACE:cs_cookie} %{NOTSPACE:cs_referer} %{NOTSPACE:cs_host} %{NUMBER:cs_status} %{NUMBER:cs_substatus} %{NUMBER:sc_win32_status} %{NUMBER:sc_bytes} %{NUMBER:cs_bytes} %{NUMBER:time_taken} %{IPORHOST:ClientIP} %{NOTSPACE:X-Forwarded-Proto} %{NUMBER:X-Forwarded-Port} %{NOTSPACE:X-auth}"
    }
    remove_field => "message"
  }

  date {
    match => [
      "log_timestamp",
      "yyyy-MM-dd HH:mm:ss"
    ]
    target => "@timestamp"
    remove_field => "log_timestamp"
  }

  geoip {
    source => "ClientIP"
    target => "geoip"
    remove_field => "ClientIP"
  }
}

output {
  amazon_es {
    hosts => ["myendpoint"]
    region => "myregion"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
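One note on the tags: the "_geoip_lookup_failure" is almost certainly a side effect of the grok failure, since ClientIP never gets created when the match fails. A minimal sketch to stop the cascading tag (the same geoip block, just wrapped in a conditional):

  # only attempt geoip when grok succeeded; otherwise ClientIP does not exist
  if "_grokparsefailure" not in [tags] {
    geoip {
      source => "ClientIP"
      target => "geoip"
      remove_field => "ClientIP"
    }
  }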

Any help will be greatly appreciated

You should try using the IIS module in Filebeat instead of sending the events to Logstash. It's much cleaner, and everything is already done for you: you just need to install the geoip and useragent ingest plugins in Elasticsearch, enable the module in Filebeat, and configure the log paths for the module. You'd then send the output directly to ES, bypassing Logstash completely, unless you need additional processing from Logstash.
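For anyone who can use the module, the setup is roughly this sketch (the log path is a placeholder, and the ingest-geoip / ingest-user-agent plugins must be installed on the Elasticsearch side first):

# Enable with: filebeat modules enable iis
# Then edit modules.d/iis.yml:
- module: iis
  access:
    enabled: true
    var.paths: ["C:/inetpub/logs/LogFiles/W3SVC1/*.log"]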

Hi Philip

That would have been my first choice, but unfortunately Amazon ES does not support those ingest plugins:
https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/aes-supported-plugins.html

That's why I am having Logstash do the heavy lifting :slight_smile:

Any other suggestions, short of not using aws_es?

Gotcha. OK, take a look at the IIS module's default.json pipeline file and make sure your grok pattern matches what is in there (on a Linux package install it typically lives under /usr/share/filebeat/module/iis/access/ingest/). I think there are multiple patterns in there that support different versions of IIS. I can't explain why it works with the stdin input and not the beats input, though.

I checked the pattern and it matches exactly. If I check the fields in the Filebeat index, they are all present as well.

I wonder if something is already recognizing and modifying the incoming log events, so the filter fails?

@JacovanZyl - what template are you using?

Also, do you get the same errors when trying a different output with real data as it's coming in, like the file output?
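Something along these lines, as a sketch (the path is a placeholder), so you can inspect the real events on disk:

output {
  # temporary debug output instead of (or alongside) amazon_es
  file {
    path => "/tmp/beats-debug-%{+YYYY.MM.dd}.log"
    codec => rubydebug
  }
}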

Hi @jamesspi

Apologies for the late reply; re:Invent was amazing!

I tested with the file output and the filter works beautifully.

This all seems to be pointing to AWS ES being the culprit :frowning:

I will start up my own ES instance to confirm.
