Grok Pattern Help with message parsing

Hi all, I am fairly new to Elasticsearch. I have Elasticsearch running, and there is some information in my logs that I need to graph, but I don't know how to filter it. Here are the contents of my message field that I need filtered.


Here is my current filter:
if [fields][log_type] == "syslog" {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} (?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
  syslog_pri { }
  date {
    match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}

I would ideally like to extract "dropped_packets":0,"tcp":12025,"jobsRX":12104,"jobsTX":11476,"jobsCnt":9934,"fpTX":0,"tpRX":0,"leadsRX":11515,"leedsCnt":68749 into fields so that I can graph them in Kibana 4. Any help will be much appreciated. Thanks in advance.

Your pattern does not match the event at all.

I'll move this to the LS section as it seems that is the core of the problem.

Any chance of telling me which patterns I need to use to accomplish my task?

I'd suggest you take a look at

If your message looks like that, I would recommend a kv filter, something like the following (not tested at all):

kv {
  source => "message"
  field_split => ","
  value_split => ":"
  trim => "\""
  trimkey => "\""
}

This will give you field/value combinations exactly as they appear in the log, and you can then use a mutate filter to rename the fields as you like.
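For example, something along these lines could rename one of the extracted fields and convert it to an integer so Kibana can graph it (untested sketch; "leads_count" is just a hypothetical target name, and two separate mutate blocks are used because mutate always runs rename before convert within a single block):

```
mutate {
  # rename the field produced by the kv filter above
  rename => { "leedsCnt" => "leads_count" }
}
mutate {
  # numeric type is needed for aggregations/graphs in Kibana
  convert => { "leads_count" => "integer" }
}
```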