Kibana configuration on a server

So I set up Kibana, Elasticsearch and Logstash on a staging server, and I used a simple Logstash config file with the following content:

vi logstash-filter.conf

input { stdin { } }

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["10.92.28.131:9200"] }
  stdout { codec => rubydebug }
}
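Since the input is stdin, one way to feed this pipeline an Apache access log (the log path below is just an example) is to pipe the file into Logstash when starting it with this config:

bin/logstash -f logstash-filter.conf < /var/log/httpd/access_log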

My question is: what do I have to do to make my Kibana look like the following:

Also, where can I find Logstash config files?

Can you provide me with some info on configuring Logstash to integrate with the logs being collected on the Hadoop server?

Moving to #logstash, since these seem to be mostly Logstash questions.

Thanks, my bad.

You're using a really old version of Kibana there. V3 is no longer supported.
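On recent Kibana versions you can confirm which version is actually running via the status endpoint (assuming Kibana is on the default port 5601), for example:

curl http://localhost:5601/api/status

The response includes the version number.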

I installed the latest version of Kibana.

The one in the picture... I don't know if it's the latest or not.

I want to analyse large data sets, and I believe that will be possible if I use larger Logstash config files. Where can I find Logstash config files to use?

There are some examples here - https://github.com/elastic/examples
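If the logs from the Hadoop server end up as plain files on disk, a minimal starting point might look something like the sketch below (the path and grok pattern are placeholders you would need to adapt to wherever the files live and whatever format they are actually in):

input {
  file {
    path => "/var/log/hadoop/*.log"   # placeholder path, point this at the actual log files
    start_position => "beginning"     # read existing content, not just newly appended lines
  }
}

filter {
  grok {
    # replace with a pattern that matches the Hadoop log format
    match => { "message" => "%{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch { hosts => ["10.92.28.131:9200"] }
}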