Filebeat + Logstash + IIS + Kibana

This is my Logstash config, 20-iis-input.config:

filter {
  if [type] == "iis" {
    if [message] =~ "^#" {
      drop {}
    }

    grok {
      break_on_match => false
      match => ["message", "%{TIMESTAMP_ISO8601:datetime} %{IPORHOST:server_ip} %{WORD:method} %{URIPATH:url} %{NOTSPACE:uriQuery} %{NUMBER:source_port} %{NOTSPACE:user} %{IP:client_ip} %{NOTSPACE:useragent} %{NOTSPACE:Referer} %{NUMBER:response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
    }

    date {
      locale => "en"
      # must reference the field grok captured above ("datetime", not "log_timestamp")
      match => [ "datetime", "YYYY-MM-dd HH:mm:ss" ]
      target => "@timestamp"
      timezone => "Indian/Maldives"
    }

    useragent {
      source => "useragent"
      prefix => "browser"
    }
  }
}

Here are some sample logs

2017-08-08 00:02:54 11.12.1.33 GET /eAdxxxxs/HomePageRequestView - 443 A151290 10.12.20.254 Mozilla/5.0+(Windows+NT+6.1)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/59.0.3071.115+Safari/537.36 https://cixxxv.mv/eAdmin/Admin/Home/Index 200 0 0 1856

2017-08-08 00:04:32 11.241.1.83 GET /WorkPermitAdmin/ - 443 - 10.12.51.254 Mozilla/5.0+(compatible;+PRTG+Network+Monitor+(www.paessler.com);+Windows) - 302 0 0 15

The grok pattern matches; I have tested it at https://grokdebug.herokuapp.com/.
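For reference, that pattern applied to the first sample line should produce roughly these captures (illustrative; values taken from the sample below, with the user agent truncated here for brevity):

datetime      => "2017-08-08 00:02:54"
server_ip     => "11.12.1.33"
method        => "GET"
url           => "/eAdxxxxs/HomePageRequestView"
uriQuery      => "-"
source_port   => "443"
user          => "A151290"
client_ip     => "10.12.20.254"
useragent     => "Mozilla/5.0+(Windows+NT+6.1)+..."
Referer       => "https://cixxxv.mv/eAdmin/Admin/Home/Index"
response      => "200"
bytesSent     => "0"
bytesReceived => "0"
timetaken     => "1856"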

Still, Elasticsearch is not getting the parsed fields; the documents just end up with the entire log line inside the message field.

My Filebeat config is below:

- type: log
  enabled: true
  paths:
    - C:\inzzb\lzzs\Lozzziles\WzzC1\*.log
  document_type: iis
  encoding: utf-8
  exclude_lines: ["^#"]
  exclude_files: [".zip"]
  ignore_older: 24h


Any pointers greatly appreciated.

Alright people, ANYONE? I am really losing my nerve on Filebeat :expressionless:

Debug each stage of your pipeline.

What does the event look like when it enters Logstash? Disable the filters and use the stdout output to log the event in Logstash. Then re-enable the filters and compare.

output {
  stdout {
    codec => rubydebug {
      metadata => true
    }
  }
}
  • Check if the type field is iis (see the sketch below).
  • Are there any tags applied by Logstash? Some filters add tags when an error occurs.
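One way to check both from inside the pipeline is a temporary conditional that tags mismatches. This is a debugging sketch only, assuming the type field and iis value from the configs above:

filter {
  if [type] != "iis" {
    # tag events whose type never made it across from Filebeat
    mutate { add_tag => ["type_not_iis"] }
  }
}

Any event carrying type_not_iis in the rubydebug output tells you the conditional in your filter section will never fire.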

It would help to know what versions of Filebeat, Logstash, and Elasticsearch you are using. You don't show how your LS inputs and outputs are configured, so hopefully you have followed the recommendations in the documentation for your specific version.

Hey, thanks. I was able to correct my issue by adding type: iis under fields, as document_type: iis had no luck for me :expressionless:

filebeat.prospectors:

  - input_type: log
    paths:
      - 'C:\inetpub\logs\LogFiles\*\*.log'
    encoding: utf-8
    exclude_lines: ["^#"]
#   exclude_files: ['.zip','.7z']
    ignore_older: 24h
    scan_frequency: 5s
    tail_files: true
    tags: ["iis"]
    fields:
      application_name: DELETED
      environment: production
      type: iis
    fields_under_root: true
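For anyone wondering why this works: without fields_under_root: true, Filebeat nests custom fields under fields, so the Logstash conditional would have to be written as in this sketch (same field names as above):

if [fields][type] == "iis" {
  # filters here
}

With fields_under_root: true the custom type becomes a top-level field, so the existing if [type] == "iis" conditional matches. Also, depending on your Filebeat version, document_type may have been deprecated or removed, which would explain why it had no effect.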

The config above is for anyone who has the same issue or has trouble getting this working. Anyway, I am now getting some events tagged _geoip_lookup_failure because the client IP is a local IP. How do I make the geoip filter ignore certain local IP ranges?

10.0.0.0/16 ?

You could handle that with the cidr filter: use cidr to add a tag like rfc1918, then apply geoip only if that tag is not present.
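A minimal sketch of that approach, assuming the client_ip field from your grok and the three RFC 1918 ranges (adjust the networks as needed):

cidr {
  # tag events whose client address falls in a private range
  add_tag => [ "rfc1918" ]
  address => [ "%{client_ip}" ]
  network => [ "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16" ]
}

if "rfc1918" not in [tags] {
  # only public addresses reach the geoip lookup
  geoip {
    source => "client_ip"
    target => "geoip"
  }
}

Since geoip never sees the private addresses, the _geoip_lookup_failure tag is never added for them.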

cidr {
  add_tag => [ "geolocal" ]
  address => [ "%{client_ip}" ]
  network => [ "192.0.2.0/24" ]
}

if "geolocal" not in [tags] {
  mutate { replace => { "[geoip][timezone]" => "Indian/Maldives" } }
  mutate { replace => { "[geoip][country_name]" => "Maldives" } }
  mutate { replace => { "[geoip][country_code2]" => "GNM" } }
  mutate { replace => { "[geoip][country_code3]" => "MV" } }
  mutate { add_field => { "[geoip][location]" => "73.5" } }
  mutate { add_field => { "[geoip][location]" => "4.167" } }
  mutate { convert => [ "[geoip][location]", "float" ] }
  mutate { replace => [ "[geoip][latitude]", 4.167 ] }
  mutate { convert => [ "[geoip][latitude]", "float" ] }
  mutate { replace => [ "[geoip][longitude]", 73.5 ] }
  mutate { convert => [ "[geoip][longitude]", "float" ] }
}
else {
  geoip {
    source => "client_ip"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}

Will this work?

The conditional logic is what I expected. I don't believe the add_field arguments are necessary on the geoip filter. You'll want to make sure the geoip.location field is mapped as a geo_point in the index template used for your index.

Not sure why, but when I try that, it adds my custom geo mapping to every client_ip that doesn't match 10.0.0.0/16 :smile:

Can anyone point out what I am doing wrong? I am getting the logs properly, although anything that matches if [client_ip] =~ /^10./ { is not getting logged :frowning:

input {
  beats {
    client_inactivity_timeout => 1200
    port => 5044
    ssl => false
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if [type] == "iis" {
    if [message] =~ "^#" {
      drop {}
    }

    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:datetime} %{IPORHOST:server_ip} %{WORD:method} %{URIPATH:url} %{NOTSPACE:uriQuery} %{NUMBER:source_port} %{NOTSPACE:user} %{IP:client_ip} %{NOTSPACE:useragent} %{NOTSPACE:Referer} %{NUMBER:response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
    }

    date {
      locale => "en"
      match => [ "datetime", "YYYY-MM-dd HH:mm:ss" ]
      target => "@timestamp"
      timezone => "Indian/Maldives"
    }

    useragent {
      source => "useragent"
      prefix => "browser"
    }
  }

  if [client_ip] =~ /^10\./ {
    mutate { replace      => { "[geoip][timezone]"      => "Pacific/Auckland" } }
    mutate { replace      => { "[geoip][country_name]"  => "University of Otago" } }
    mutate { replace      => { "[geoip][country_code2]" => "UO" } }
    mutate { replace      => { "[geoip][country_code3]" => "UoO" } }
    mutate { remove_field => [ "[geoip][location]" ] }
    mutate { add_field    => { "[geoip][location]"      => "170.525" } }
    mutate { add_field    => { "[geoip][location]"      => "-45.865" } }
    mutate { convert      => [ "[geoip][location]",    "float" ] }
    mutate { replace      => [ "[geoip][latitude]",    -45.856 ] }
    mutate { convert      => [ "[geoip][latitude]",    "float" ] }
    mutate { replace      => [ "[geoip][longitude]",   170.525 ] }
    mutate { convert      => [ "[geoip][longitude]",   "float" ] }
    mutate { convert      => [ "[geoip][coordinates]", "float" ] }
  }
  else {
    geoip {
      source => "client_ip"
      target => "geoip"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
  }
}

After much trouble I got it working :smiley:

input {
  beats {
    client_inactivity_timeout => 1200
    port => 5044
    ssl => false
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if [type] == "iis" {
    if [message] =~ "^#" {
      drop {}
    }

    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:datetime} %{IPORHOST:server_ip} %{WORD:method} %{URIPATH:url} %{NOTSPACE:uriQuery} %{NUMBER:source_port} %{NOTSPACE:user} %{IP:client_ip} %{NOTSPACE:useragent} %{NOTSPACE:Referer} %{NUMBER:response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
    }

    date {
      locale => "en"
      match => [ "datetime", "YYYY-MM-dd HH:mm:ss" ]
      target => "@timestamp"
      timezone => "Indian/Maldives"
    }

    useragent {
      source => "useragent"
      prefix => "browser"
    }

    cidr {
      add_tag => [ "geolocal" ]
      address => [ "%{client_ip}" ]
      network => [ "10.0.0.0/8" ]
    }

    geoip {
      source => "client_ip"
      target => "geoip"
    }

    if "geolocal" in [tags] {
      mutate { replace    => { "[geoip][timezone]"       => "Indian/Maldives" } }
      mutate { replace    => { "[geoip][ip]"             => "%{client_ip}" } }
      mutate { replace    => { "[geoip][latitude]"       => "4.1667" } }
      mutate { convert    => { "[geoip][latitude]"       => "float" } }
      mutate { replace    => { "[geoip][country_name]"   => "Maldives" } }
      mutate { replace    => { "[geoip][continent_code]" => "AS" } }
      mutate { replace    => { "[geoip][country_code2]"  => "MV" } }
      mutate { replace    => { "[geoip][country_code3]"  => "MV" } }
      mutate { replace    => { "[geoip][longitude]"      => "73.5" } }
      mutate { convert    => { "[geoip][longitude]"      => "float" } }
      mutate { convert    => { "[geoip][location]"       => "float" } }
      mutate { replace    => { "[geoip][location][lat]"  => "%{[geoip][latitude]}" } }
      mutate { convert    => { "[geoip][location][lat]"  => "float" } }
      mutate { replace    => { "[geoip][location][lon]"  => "%{[geoip][longitude]}" } }
      mutate { convert    => { "[geoip][location][lon]"  => "float" } }
      mutate { convert    => { "[geoip][coordinates]"    => "float" } }
      mutate { add_tag    => [ "Gov Network" ] }
      mutate { remove_tag => [ "_geoip_lookup_failure" ] }
    }
  }
}

Although I still can't visualize it on the Kibana "Region Map".

I get this error:

"Region map: Could not show 5 results on the map. To avoid this, ensure that each term can be joined to a corresponding shape on that shape's join field. Could not join following terms: 301,500,304,404,302"

any ideas?

Glad you managed to get the data in.

Looking at the error above, it seems like HTTP response codes (301, 500, 304, 404, 302) have somehow ended up as the terms in your region map. Perhaps have a look at the JSON documents in Elasticsearch and check whether these values show up where they should.

Hey,

What should I select in the "Field" box to get the Region Map data? Nothing seems to work for me. I don't understand what I am doing at this point when trying to create the visualization.

This is the localhost:9200/_template response:

{"logstash":{"order":0,"version":50001,"template":"logstash-*","settings":{"index":{"refresh_interval":"5s"}},"mappings":{"_default_":{"_all":{"enabled":true,"norms":false},"dynamic_templates":[{"message_field":{"path_match":"message","match_mapping_type":"string","mapping":{"type":"text","norms":false}}},{"string_fields":{"match":"*","match_mapping_type":"string","mapping":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}],"properties":{"@timestamp":{"type":"date","include_in_all":false},"@version":{"type":"keyword","include_in_all":false},"geoip":{"dynamic":true,"properties":{"ip":{"type":"ip"},"location":{"type":"geo_point"},"latitude":{"type":"half_float"},"longitude":{"type":"half_float"}}}}}},"aliases":{}}}

You have consistently left out the output block from your config, so I cannot see which index you are writing to. Note that the default template you have in place only applies to indices matching the logstash-* pattern. If you are indexing into an index that does not match it, your mappings are most likely not correct.

Hey, thanks, this was my config. So that means I should change the index to "logstash-%{+YYYY.MM.dd}"?

output {
  elasticsearch {
    hosts => "IPERASED:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

The index template needs to match the index name, so you can update either the template or the index name.
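For the second option, that is a one-line change to the output you posted; a sketch with everything else left as-is:

output {
  elasticsearch {
    hosts => "IPERASED:9200"
    manage_template => false
    # name the index so it matches the logstash-* template pattern
    index => "logstash-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Once new daily indices match logstash-*, the template's geo_point mapping for geoip.location will be applied to them.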

This topic was automatically closed after 21 days. New replies are no longer allowed.