Not able to filter two types of events in Logstash filter

Hi, I am using Filebeat to send events to Logstash.
I can see in the Filebeat logs that an event is generated after reading the file, but I cannot tell from the Logstash logs whether the event is reaching Logstash and, from there, Elasticsearch.
Syslog events are getting processed, but Tomcat log events are not.

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [type] == "tomcat_log" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:date} %{LOGLEVEL:level} %{DATA:controller}:%{NUMBER} - %{DATA:log} \[%{NUMBER:processTime}\] ms" }
    }
    if "_grokparsefailure" in [tags] {
      drop {}
    }
  }
}

How do you know they aren't hitting the drop?
You should check that the pattern works and the events are making it through the pipeline.

Because I am not able to see them in Kibana.

Right, so start with your config like I mentioned.

I have 3 files under /conf.d

30-elasticsearch-output.conf

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
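Given the metadata shown later in this thread (beat name "filebeat", events from 2016-04-18), that index pattern expands to something like filebeat-2016.04.18. A rough Python sketch of the expansion (the Joda pattern YYYY.MM.dd is approximated here with strftime; the function name is illustrative):

```python
from datetime import datetime, timezone

def index_name(beat: str, ts: datetime) -> str:
    # Mimics the Logstash sprintf "%{[@metadata][beat]}-%{+YYYY.MM.dd}";
    # Joda "YYYY.MM.dd" roughly corresponds to strftime "%Y.%m.%d".
    return f"{beat}-{ts.strftime('%Y.%m.%d')}"

print(index_name("filebeat", datetime(2016, 4, 18, tzinfo=timezone.utc)))
# filebeat-2016.04.18
```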

10-syslog-filter.conf

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

02-beats-input.conf

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

Please help me with a default Tomcat log filter.
I am new to ELK; I appreciate your help.
In many places I found only a single conf file. What difference does it make to have 3 separate configuration files, and how can I add multiple filters for multiple Filebeat sources, i.e. for syslog and Tomcat logs?

I have the same config and I want to configure a Cisco ASA firewall,
and I'm lost too.

Please keep to your own thread.

In many places I found only a single conf file. What difference does it make to have 3 separate configuration files ...

None, really. It's up to you how you want to organize your files. Remember that the order of filters matters and that configuration files in a directory are processed in alphabetical order (which you seem to be aware of, given the filenames you've chosen).
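In other words, Logstash behaves as if the directory's files were concatenated in lexical order, which is why numeric prefixes are a common convention. A quick illustration using the filenames from this thread:

```python
# Logstash processes *.conf files in a directory in lexical (alphabetical)
# order, so numeric prefixes determine which block comes first.
files = ["30-elasticsearch-output.conf", "02-beats-input.conf", "10-syslog-filter.conf"]
print(sorted(files))
# ['02-beats-input.conf', '10-syslog-filter.conf', '30-elasticsearch-output.conf']
```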

how can I add multiple filters for multiple Filebeat sources, i.e. for syslog and Tomcat logs?

What you've done so far looks good. To debug, disable the elasticsearch output and use a stdout { codec => rubydebug { metadata => true } } output instead. That way you remove one layer of complexity and source of confusion, and you can see exactly what the processed events look like. What do your syslog and Tomcat events look like?
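Spelled out, the temporary debug output could look like this (swapped in for 30-elasticsearch-output.conf while testing; the filename is just this thread's convention):

```
output {
  stdout { codec => rubydebug { metadata => true } }
}
```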

As suggested, below is the event I am able to see in stdout. Please help me filter it in Logstash.

{
    "message" => "2016-04-18 20:44:06 DEBUG OrderController:70 - createOrder [EquityOrder{side=Sell, orderType=Market, orderQualifier=DayOrder, accountType=Cash, quantity=1, stopPrice=null, limitPrice=null, notes='N', orderStatus=New, orderId=44}] processed in [19] ms",
    "@version" => "1",
    "@timestamp" => "2016-04-18T15:14:13.350Z",
    "beat" => {
        "hostname" => "",
        "name" => ""
    },
    "count" => 1,
    "fields" => nil,
    "input_type" => "log",
    "offset" => 501,
    "source" => "/opt/apache-tomcat-9.0.0.M1/logs/order-ui.log",
    "type" => "tomcat_log",
    "host" => " ",
    "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ],
    "@metadata" => {
        "beat" => "filebeat",
        "type" => "tomcat_log"
    }
}

Okay. This looks good except that your grok filter is incorrect (that's why you get a _grokparsefailure tag). If you post your grok filter we can help you correct it.

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [type] == "tomcat_log" {
    grok {
      match => {
        "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
      }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
      locale => en
    }
    geoip {
      source => "clientip"
    }
  }
}

The grok expression

%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}

doesn't even resemble this log entry:

2016-04-18 20:44:06 DEBUG OrderController:70 - createOrder [EquityOrder{side=Sell, orderType=Market, orderQualifier=DayOrder, accountType=Cash, quantity=1, stopPrice=null, limitPrice=null, notes='N', orderStatus=New, orderId=44}] processed in [19] ms

The grok expression you're using is for HTTP access logs, but you're trying to parse an application log.

Yeah, I am new to grok. I appreciate your help so far. Please point me to some relevant material for learning grok;
the required grok for the existing problem would also help a lot.

Grok expressions are basically regular expressions, and there's lots of material around that covers those. Once you understand the concept of regular expressions you shouldn't have any difficulties with grok expressions.
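For instance, the sample Tomcat line from the rubydebug output above can first be matched with a plain Python regular expression; each named group then maps to a grok pattern. The group names and the candidate grok pattern in the comment are illustrative, not the one definitive answer:

```python
import re

# Sample line taken from the rubydebug output earlier in this thread.
line = ("2016-04-18 20:44:06 DEBUG OrderController:70 - createOrder "
        "[EquityOrder{side=Sell, orderType=Market, orderQualifier=DayOrder, "
        "accountType=Cash, quantity=1, stopPrice=null, limitPrice=null, "
        "notes='N', orderStatus=New, orderId=44}] processed in [19] ms")

# Rough plain-regex equivalent of a candidate grok pattern such as:
# %{TIMESTAMP_ISO8601:date} %{LOGLEVEL:level} %{DATA:controller}:%{NUMBER:line}
#   - %{GREEDYDATA:log} processed in \[%{NUMBER:processTime}\] ms
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) "
    r"(?P<controller>\S+?):(?P<line>\d+) - "
    r"(?P<log>.*) processed in \[(?P<processTime>\d+)\] ms$"
)

m = pattern.match(line)
print(m.group("date"), m.group("level"), m.group("processTime"))
# 2016-04-18 20:44:06 DEBUG 19
```

Note how the literal square brackets are escaped as \[ and \], exactly as they must be in grok.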

What would be the correct grok to read this log?

I don't have time to craft an expression for you, but maybe someone else can help. You really should try it yourself, though. The Grok Constructor site can be of great help for beginners.