Logs being sent to Logstash server, but nothing showing up in the Kibana dashboard

I'm having trouble viewing the logs that are being sent to my Logstash server. I'm using Filebeat to ship HTTP logs and IIS logs, and Winlogbeat to ship my Windows event logs. I receive the following messages when I start Filebeat and Winlogbeat, so I'm assuming the logs are being sent to the Logstash server successfully, but they don't appear anywhere in Kibana. Please let me know if you need any additional config files to help me out.

#Filebeat
C:\Program Files\filebeat>filebeat.exe -c filebeat.yml -e -v
2016/08/03 20:07:49.210982 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/08/03 20:07:49.211982 logstash.go:106: INFO Max Retries set to: 3
2016/08/03 20:07:50.234085 transport.go:125: ERR SSL client failed to connect with: dial tcp 10.100.20.222:5044: connectex: No connection could be made because the target machine actively refused it.
2016/08/03 20:07:50.235085 outputs.go:126: INFO Activated logstash as output plugin.
2016/08/03 20:07:50.245086 log.go:113: INFO Harvester started for file: C:\Program Files\HTTPLog\httperr133.log
2016/08/03 20:07:50.245086 log.go:113: INFO Harvester started for file: C:\Program Files\HTTPLog\httperr133 - Copy.log
2016/08/03 20:07:50.242085 spooler.go:77: INFO Starting spooler: spool_size: 10000; idle_timeout: 10s
2016/08/03 20:07:50.244086 prospector.go:143: INFO Starting prospector of type: log
2016/08/03 20:07:50.246086 crawler.go:78: INFO All prospectors initialised with 5 states to persist
2016/08/03 20:07:50.246086 registrar.go:87: INFO Starting Registrar
2016/08/03 20:07:50.246086 publish.go:88: INFO Start sending events to output
2016/08/03 20:07:50.247086 log.go:113: INFO Harvester started for file: C:\Program Files\IISlogs\u_ex160803.log
2016/08/03 20:07:50.245086 log.go:113: INFO Harvester started for file: C:\Program Files\HTTPLog\httperr133 - Copy (2).log
2016/08/03 20:08:20.250086 log.go:113: INFO Harvester started for file: C:\Program Files\HTTPLog\httperr133 - Copy (4).log
2016/08/03 20:08:20.251086 registrar.go:162: INFO Registry file updated. 6 states written.
2016/08/03 20:08:39.208982 publish.go:104: INFO Events sent: 6632
2016/08/03 20:08:39.217982 registrar.go:162: INFO Registry file updated. 6 states written.
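Side note: the ERR line above means the very first connection attempt to 10.100.20.222:5044 was refused, i.e. nothing was accepting connections on that port at that moment (the output retries, and the later "Events sent: 6632" line suggests a retry got through). A quick way to check reachability from the Windows host, assuming a PowerShell version that ships Test-NetConnection, is something like:

PS C:\> Test-NetConnection -ComputerName 10.100.20.222 -Port 5044

If TcpTestSucceeded comes back False, the problem is in front of Logstash: the service not running, the wrong port, or a firewall.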

#Winlogbeat

C:\Program Files\Winlogbeat>winlogbeat.exe -c winlogbeat.yml -e -v
2016/08/03 20:28:43.360360 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/08/03 20:28:43.361360 logstash.go:106: INFO Max Retries set to: 3
2016/08/03 20:28:43.576360 outputs.go:126: INFO Activated logstash as output plugin.
2016/08/03 20:28:43.576360 publish.go:288: INFO Publisher name: My-Server
2016/08/03 20:28:43.579360 async.go:78: INFO Flush Interval set to: 1s
2016/08/03 20:28:43.579360 async.go:84: INFO Max Bulk Size set to: 2048
2016/08/03 20:28:43.580360 beat.go:147: INFO Init Beat: winlogbeat; Version: 1.2.3
2016/08/03 20:28:43.580360 winlogbeat.go:87: INFO State will be read from and persisted to C:\ProgramData\winlogbeat.winlogbeat.yml
2016/08/03 20:28:43.612360 beat.go:173: INFO winlogbeat sucessfully setup. Start running.
2016/08/03 20:28:43.632360 winlogbeat.go:200: WARN EventLog[Security] Open() error. No events will be read from this source. Access is denied.
2016/08/03 20:28:45.084360 winlogbeat.go:260: INFO EventLog[System] Successfully published 50 events
2016/08/03 20:28:45.501360 winlogbeat.go:260: INFO EventLog[Application] Successfully published 50 events
2016/08/03 20:28:45.907360 winlogbeat.go:260: INFO EventLog[System] Successfully published 50 events
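Side note on the WARN line: reading the Security channel normally requires the account running Winlogbeat to be an administrator (or LocalSystem when running as a service) or a member of the built-in Event Log Readers group, which is why that source is skipped here. A hedged example of granting read access, with a placeholder account name rather than anything from this post:

:: run from an elevated command prompt; replace the account with whatever Winlogbeat actually runs as
net localgroup "Event Log Readers" "MYDOMAIN\winlogbeat_svc" /add

Alternatively, starting the console session elevated (Run as administrator) should make the "Access is denied" warning go away.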

Thank you in advance!

How do you know they are processing through LS?
What does your config look like?

The fields are showing up under Settings -> Index Patterns, for both the Filebeat and Winlogbeat logs.
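Fields showing up under an index pattern does not necessarily mean new documents are arriving; it may be worth checking what Elasticsearch actually holds, for example directly on the Logstash/Elasticsearch box (host and port assumed from the 3_output.conf posted below):

curl 'http://localhost:9200/_cat/indices?v'
curl 'http://localhost:9200/logstash-*/_search?size=1&pretty'

With 3_output.conf as posted, the elasticsearch output writes to its default logstash-YYYY.MM.dd indices rather than to filebeat-* or winlogbeat-* ones, so the index pattern selected in Kibana (and the time range in the picker) has to match those.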

#WinLogBeat Configuration
winlogbeat:
  registry_file: C:/ProgramData/winlogbeat/.winlogbeat.yml

  event_logs:
    - name: Application
    - name: Security
    - name: System

output:
  logstash:
    hosts: ["myserver:5044"]

    # A template is used to set the mapping in Elasticsearch
    # By default template loading is disabled and no template is loaded.
    # These settings can be adjusted to load your own template or overwrite existing ones
    template:

      # Template name. By default the template name is winlogbeat.
      name: "winlogbeat"

      # Path to template file
      path: "winlogbeat.template.json"

      # Overwrite existing template
      #overwrite: false

    tls:
      # List of root certificates for server verifications
      certificate_authorities: ["C:/Program Files/winlogbeat//logstash-forwarder.crt"]

logging:
  to_files: true
  files:
    path: C:/ProgramData/winlogbeat/Logs
  level: info
#Filebeat Configuration
filebeat:
  spool_size: 10000
  idle_timeout: 10s

  prospectors:
    -
      type: IISLog
      document_type: IISLog
      paths:
        - "C:/Program Files/IISlogs/"
    -
      type: HTTPLog
      document_type: HTTPLog
      paths:
        - "C:/Program Files/HTTPLog/"

  registry_file: "C:/ProgramData/filebeat/registry"

output:
  ### Logstash as output
  logstash:
    # The Logstash hosts
    enabled: true
    hosts:
      - "MyServer:5044"
    bulk_max_size: 1024
    index: "logstash"

    template:

      # Template name. By default the template name is filebeat.
      name: "logstash"

      # Path to template file
      path: "C:/Program Files/Filebeat/logstash-template.json"

      # Overwrite existing template
      overwrite: true

    tls:
      # List of root certificates for server verifications
      certificate_authorities: ["C:/Program Files/Filebeat//logstash-forwarder.crt"]

shipper:

logging:
  level: info
  to_files: true
  to_syslog: false

  files:
    path: C:/Filebeat
    name: filebeat.log
    keepfiles: 5
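Since YAML is indentation-sensitive, it may be worth validating both files before restarting the beats; if I remember right, the 1.x Beats binaries accept a -configtest flag that just parses the config and exits, so indentation or quoting mistakes show up immediately:

C:\Program Files\filebeat>filebeat.exe -c filebeat.yml -configtest
C:\Program Files\Winlogbeat>winlogbeat.exe -c winlogbeat.yml -configtest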

#Logstash Conf
1_beats_inputs.conf
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

2_iis_filters.conf
filter {
  if [message] =~ "^#" { drop {} }

  # IIS
  if [type] == "IISLog" {
    grok {
      match => [
        "message","(%{TIMESTAMP_ISO8601:log_timestamp})\s(?<Server_Sitename>(.*?))\s(?<Server_IP>(.*?))\s(?<HTTP_Request_Method>(.*?))\s(?<IIS_Client_Query>(.*?))\s(?<Client_Stem>(.*?))\s(?<Server_Port>(.*?))\s(?<Client_ID>(.*?))\s(?<Client_IP>(.*?))\s(?<User_Agent>(.*?))\s(?<Referer>(.*?))\s(?<HTTP_Status_Code>(?:[4-5_.-]+[0-9_.-]+[0-9_.-]))\s(?<Server_Substatus>(.*?))\s(?<Server_Win32_Status>(.*?))\s(%{GREEDYDATA:Request_Time_Elapsed})",
        "message","(%{TIMESTAMP_ISO8601:log_timestamp})\s(?<Server_IP>(.*?))\s(?<HTTP_Request_Method>(.*?))\s(?<IIS_Client_Query>(.*?))\s(?<Client_Stem>(.*?))\s(?<Server_Port>(.*?))\s(?<Client_ID>(.*?))\s(?<Client_IP>(.*?))\s(?<User_Agent>(.*?))\s(?<Referer>(.*?))\s(?<HTTP_Status_Code>(?:[4-5_.-]+[0-9_.-]+[0-9_.-]))\s(?<Server_Substatus>(.*?))\s(?<Server_Win32_Status>(.*?))\s(%{GREEDYDATA:Request_Time_Elapsed})",
        "message","(%{TIMESTAMP_ISO8601:log_timestamp})\s(?<Server_IP>(.*?))\s(?<HTTP_Request_Method>(.*?))\s(?<IIS_Client_Query>(.*?))\s(?<Client_Stem>(.*?))\s(?<Server_Port>(.*?))\s(?<Client_ID>(.*?))\s(?<Client_IP>(.*?))\s(?<User_Agent>(.*?))\s(?<HTTP_Status_Code>(?:[4-5_.-]+[0-9_.-]+[0-9_.-]))\s(?<Server_Substatus>(.*?))\s(?<Server_Win32_Status>(.*?))\s(%{GREEDYDATA:Request_Time_Elapsed})"
      ]
      add_tag => "IIS"
    }
    date {
      match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
      timezone => "Etc/UCT"
    }
  }
}

3_output.conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
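On the question of how to tell whether events are actually making it through Logstash: one option is to temporarily add a stdout output next to the elasticsearch one and watch Logstash's console or log, e.g. a debugging-only variant of 3_output.conf:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # temporary, for debugging only: print every event Logstash emits
  stdout { codec => rubydebug }
}

The config files can also be sanity-checked before a restart with something like bin/logstash --configtest -f /etc/logstash/conf.d/ (flag as in the Logstash 2.x series; the install path is an assumption).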

Hey guys, any update on this? I'm still stuck.

Start with the basics, a simple config, then work forwards.
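For example, a stripped-down filebeat.yml along these lines (paths and host reused from the post; template, TLS and the custom index left out until plain events show up, which also means temporarily setting ssl => false on the beats input):

filebeat:
  prospectors:
    -
      paths:
        - "C:/Program Files/IISlogs/*.log"
      document_type: IISLog

output:
  logstash:
    hosts: ["10.100.20.222:5044"]

Note the *.log glob; as far as I recall the 1.x prospectors expect file patterns rather than bare directories. Once documents arrive with this, add the HTTP log prospector back, then TLS, then the template settings, one at a time.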