Search for IIS logs

How can I search for IIS logs? What input_type should I be using?

You can leave input_type alone. What you're probably thinking of is the document_type option. It can have any value, but if you're using Logstash to parse the logs it needs to match your Logstash configuration (i.e. if Logstash parses logs of type "x" in a certain way, you need to make sure Filebeat stamps your logs with type "x").
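For illustration, a minimal Filebeat 1.x prospector sketch that stamps IIS logs with a document_type (the log path and the "iis" type name are placeholders to adapt to your setup):

filebeat:
  prospectors:
    - paths:
        # Adjust to wherever your IIS site writes its logs
        - C:\inetpub\logs\LogFiles\W3SVC2\*.log
      input_type: log
      # This value ends up as the "type" field on each event
      document_type: iis

In your Logstash filter you would then match on that type, e.g. if [type] == "iis" { ... }.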

I am just using Filebeat and Elasticsearch to do this. I will give it a try with input_type only.

I am trying to load all my IIS logs into Elasticsearch so that I can search by HTTP status code, e.g. 200 or 400. Below is a sample IIS log where I got a 200 and a 400 response. I need to search for all 200 responses and all 400 responses (sc-status).

#Fields: date time s-sitename s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs-version cs(User-Agent) cs(Cookie) cs(Referer) cs-host sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken

2016-03-11 03:43:45 W3SVC2 Server001 123.34.123.123 POST /MyService/api/click2mycall/ - 8080 - 123.23.23.36 HTTP/1.1 Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/48.0.2564.116+Safari/537.36 - - Server001:8080 200 0 0 218 1076 11080

2016-03-11 03:43:59 W3SVC2 Server001 123.34.123.123 POST /MyService/api/orders/phonesearch - 8080 - 123.23.23.36 HTTP/1.1 Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/48.0.2564.116+Safari/537.36 - - Server001:8080 400 0 0 984 791 275

You should feed these logs to Logstash so that you can parse them and extract e.g. the HTTP status code.

Can I use Filebeat to parse them out?

No, Filebeat doesn't have any parsing features.

If you could share a sample link for Logstash parsing, that would be great.

Have a look at https://www.elastic.co/guide/en/logstash/current/config-examples.html.

Can I use Logstash to pull .txt files?

If by "pull .txt files" you mean read and parse line-oriented text files containing logs, then yes. The documentation I linked to gives a couple of examples of how to do that.
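For example, a minimal file input that reads plain text files directly (the path is a placeholder; note that the file input expects forward slashes, even on Windows):

input {
  file {
    # Read existing IIS log files from the beginning rather than tailing only new lines
    path => "C:/inetpub/logs/LogFiles/W3SVC2/*.txt"
    start_position => "beginning"
  }
}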

Send the logs via Filebeat to Logstash, use a filter to parse the data, and then output to Elasticsearch.

Example header:

#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status time-taken

Logstash config:

input {
  beats {
    port => 5044
  }
}

filter {
  # Drop the comment/header lines that IIS writes at the top of each log file
  if [message] =~ /^#/ {
    drop { }
  }
  # Parse each log line into named fields; scstatus is indexed as an integer
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} (%{NOTSPACE:username})? %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NUMBER:scstatus:int} %{NUMBER:scsubstatus} %{NUMBER:scwin32status} %{NUMBER:timetaken}"]
  }
}
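The config above stops at the filter; a possible output section is sketched below (the host, index name, and date pattern are assumptions to adjust):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Example index name; change to suit your naming scheme
    index => "iislogs-%{+YYYY.MM.dd}"
  }
}

Once the documents are indexed, you can search for all 400 responses with something like:

curl -XGET 'localhost:9200/iislogs-*/_search?q=scstatus:400'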

Thanks. Will give it a try.