Grok fails in real time but works fine in the Grok Debugger

Hello everyone,
I'm trying to parse my FortiGate firewall logs with grok. Please help.
I'm getting the failure tag "_grokparsefailure_sysloginput" (in the Kibana GUI), but the Grok Debugger parses the same message fine.
I'm using Logstash 1.5.0-rc3 with Java version "1.7.0_65".
Any help will be appreciated.

filter {
  if [type] == "syslog" {
    grok {
      match => [
        'message', ".date=(?:%{YEAR:fw_year})-(?:%{MONTHNUM:fw_month})-(?:%{MONTHDAY:fw_day}).(?:%{NOTSPACE}).devname(?:%{NOTSPACE:fw_name}).devid=(?:%{NOTSPACE:serialid}).(?:%{NOTSPACE}).type=(?:%{NOTSPACE:utm}).subtype=(?:%{NOTSPACE:IPS}).eventtype=(?:%{NOTSPACE:signature}).level=(?:%{NOTSPACE:ips-alert}).(?:%{NOTSPACE}).severity=(?:%{NOTSPACE:severity}).srcip=(?:%{NOTSPACE:srcip}).dstip=(?:%{NOTSPACE:dstip}).(?:%{NOTSPACE}).(?:%{NOTSPACE}).(?:%{NOTSPACE}).(?:%{NOTSPACE}).(?:%{NOTSPACE}).status=(?:%{NOTSPACE:status}).proto=(?:%{NOTSPACE:protocol}).service=(?:%{NOTSPACE:service}).(?:%{NOTSPACE}).attackname(?:%{NOTSPACE:attack}).(?:%{NOTSPACE}).dstport=(?:%{NOTSPACE:dstport}).attackid=(?:%{NOTSPACE:attackid}).sensor=(?:%{NOTSPACE:sensor}).ref=(?:%{NOTSPACE:reffrenceurl})*.(?:%{GREEDYDATA})"
      ]
    }
  }
}

What do the messages you're trying to parse look like?

*This is one of the messages that comes from my FortiGate firewall*

<157>date=2015-05-11 time=10:08:16 devname=RUMOFW01 devid=FG100D3G14823541
logid=0317013312 type=utm subtype=webfilter eventtype=ftgd_allow
level=notice vd="vpn-s2s" policyid=8893 identidx=0 sessionid=399023432
srcip=172.18.2.28 srcport=50123 srcintf="port7" dstip=87.251.132.211
dstport=80 dstintf="wan1" service="http" hostname="gadgets.live.com"
profiletype="Webfilter_Profile"
profile="ECI-WEB-Filter-All" status="passthrough" reqtype="direct"
url="/configW7.xml" sentbyte=526 rcvdbyte=419 msg="URL belongs to an
allowed category in policy" method=domain class=0 cat=41 catdesc="Search
Engines and Portals"

The grok filter isn't always the best choice. In this case, use the kv filter instead:

filter {
  grok {
    match => [
      "message",
      "<%{INT:syslog_pri}>%{GREEDYDATA:message}"
    ]
    overwrite => "message"
  }
  kv { }
}
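For reference, the kv filter by default splits the message on whitespace and each piece on the first "=", so a line like "date=2015-05-11 time=10:08:16 devname=RUMOFW01" becomes the fields date, time, and devname. Should the defaults ever need adjusting, the standard field_split and value_split options control this; a sketch (these happen to be the default values, shown only for illustration):

  kv {
    field_split => " "
    value_split => "="
  }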

Okay, thanks. I will try it and update you.
Tal

I'm sorry, it didn't change. I'm still getting "_grokparsefailure_sysloginput",
and all the fields are still in "message", not separated.

These are all my conf files:
input.conf, output.conf, and filter.conf.

input {
  syslog {
    host => "x.x.x.x"
    port => 514
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

output {
  elasticsearch {
    protocol => "http"
  }
}

filter {
  grok {
    match => [
      "message",
      "<%{INT:syslog_pri}>%{GREEDYDATA:message}"
    ]
    overwrite => "message"
  }
  kv { }
}

Oh, now I get it. You get a _grokparsefailure_sysloginput tag, indicating that it's the syslog input's own use of grok that's failing.

The subsequent grok and kv filter should still be successful though.

Thank you, but I'm not sure I follow you.
How exactly do I do it? Can you show me, please?
Tal

You can't prevent the initial addition of the _grokparsefailure_sysloginput tag (but you can remove it afterwards). As I said, your grok and kv filters should be working, and if they really aren't I suggest you scale down your configuration to a minimal example that we can use for debugging. For example, drop the inputs and outputs and replace them with this:

input {
  stdin { }
}

output {
  stdout {
    codec => rubydebug
  }
}

Then feed the string you think you're getting from syslog to Logstash. The results of that should make it easier to figure out what's going on and decide on a next step.
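Concretely, assuming the configuration above is saved in a file named test.conf (a name chosen here purely for illustration) and you run from the Logstash installation directory, something like this should reproduce the problem:

  echo '<157>date=2015-05-11 time=10:08:16 devname=RUMOFW01' | bin/logstash -f test.conf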

Hi,
I have installed the suggested plugin:
bin/plugin install logstash-input-syslog

But I still get the nasty message:
"tags: _grokparsefailure_sysloginput"

If the plugin wasn't installed you'd have bigger problems.

I repeat: You can't prevent the initial addition of the _grokparsefailure_sysloginput tag. Focus on the real problem instead. I have described the steps I think you should take to debug why your grok and kv filters aren't working.
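If the tag itself is the only concern, it can be removed after the fact with a mutate filter (remove_tag is a standard mutate option); a minimal sketch:

  filter {
    mutate {
      remove_tag => ["_grokparsefailure_sysloginput"]
    }
  }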

Do you have a suggestion for how to debug this?
I see no errors in logstash.log.

I gave you some suggestions in my last response ("I suggest you scale down your configuration ...").

Hi Magnus Bäck,
After a week of debugging I think I have a lead on the root of the grok failure.
When I use the file input, grok parses the "message" with no failure and I can see all fields correctly.
This input works perfectly:

input {
  file {
    type => "fortigate"
    path => [ "/var/log/*.log" ]
  }
}

But when I use the Logstash syslog input,
I get a grok failure.

The message appears in Kibana with a leading ASCII sequence, like this:
<157>date=2015-05-19 time=10:04:32 devname=USFLFW01-MASTER
I have reason to believe the syslog message is being distorted somehow.

input {
  syslog {
    host => "x.x.x.x"
    port => 514
  }
}

Any advice will be appreciated.
TIA

The leading <157> is expected; see http://stackoverflow.com/a/30143229/414355 for an explanation.
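If the facility and severity behind that number are of interest, the stock syslog_pri filter can decode them into human-readable fields (by default it reads a field named syslog_pri, which the grok pattern suggested earlier happens to capture); a sketch:

  filter {
    syslog_pri { }
  }

For <157> this corresponds to facility local3 and severity notice, since the PRI value is facility × 8 + severity (157 = 19 × 8 + 5).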