Newbie: Grok Pattern for Cisco FirePower Conf file

I am trying to set up a basic test grok filter to grab syslog messages from Cisco Sourcefire. To start, I am only grabbing logs related to blocked events. Here is a sample log message:

Jul 28 18:52:51 CentralFP1 URLblacklist: Protocol: TCP, SrcIP: 10.10.10.50, DstIP: 8.19.136.249, SrcPort: 59012, DstPort: 80, TCPFlags: 0x0, IngressInterface: inside, EgressInterface: outside, IngressZone: inside, EgressZone: outside, DE: Primary Detection Engine (e7398b2e-f51b-11e5-bfa8-9ca93c497692), Policy: initial, ConnectType: Start, AccessControlRuleName: Block-Sites-all-District, AccessControlRuleAction: Block, UserAgent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36, Client: Chrome, ClientVersion: 51.0.2704.106, ApplicationProtocol: HTTP, InitiatorPackets: 3, ResponderPackets: 1, InitiatorBytes: 826, ResponderBytes: 62, NAPPolicy: Initial Inline Network Analysis Policy - Sourcefire3D.domain.com, DNSResponseType: No Error, Sinkhole: Unknown, HTTPReferer: http://popularscience.html, ReferencedHost: postback.advconversion.com, URLCategory: Malware Sites, URLReputation: High risk, URL: http://postback.advconversion.com

My Logstash Conf file:

input {
  udp {
    port => 5545
  }
}

filter {
  grok {
 match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{HOST:host} %{WORD:blockedReason}: Protocol: %{WORD:protocol}, SrcIP: %{IP:srcIP}, DstIP: %{IP:dstIP}, SrcPort: %{NUMBER:srcPort}, DstPort: %{NUMBER:dstPort}, TCPFlags: %{WORD:tcpFlags}, IngressInterface: %{WORD:ingressINT}, EgressInterface: %{WORD:egressINT}, IngressZone: %{WORD:ingressZone}, EgressZone: %{WORD:egressZone}, DE: %{DATA:primaryDE}, Policy: %{WORD:policy}, ConnectType: %{WORD:connectType}, AccessControlRuleName: %{DATA:aclRuleName}, AccessControlRuleAction: %{WORD:aclRuleAction}, UserAgent: %{DATA:userAgent}, Client: %{WORD:client}, ClientVersion: %{DATA:clientVer}, ApplicationProtocol: %{WORD:appProtocol}, InitiatorPackets: %{NUMBER:initPackets}, ResponderPackets: %{NUMBER:responderPackets}, InitiatorBytes: %{NUMBER:initBytes}, ResponderBytes: %{NUMBER:respondBytes}, NAPPolicy: %{DATA:NaPPolicy}, DNSResponseType: %{DATA:dnsResponseType}, Sinkhole: %{WORD:sinkhole}, HTTPReferer: %{URI:httpReferer},  ReferencedHost: %{DATA:referencedHost}, URLCategory: %{DATA:urlCategory}, URLReputation: %{DATA:urlReputation}, URL: %{URI:URL}"}
  }
}

output {
  elasticsearch {}
}

I am well aware that the filter is probably not the most efficient way of doing this, but it's a start toward understanding for me. When I run with --configtest it passes and says the conf file is OK. When I go to start Logstash, I get the following error:

Settings: Default pipeline workers: 4
Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{HOST:host} not defined>, :backtrace=>["/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb:123:in `compile'", "org/jruby/RubyKernel.java:1479:in `loop'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.2/lib/grok-pure.rb:93:in `compile'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-2.0.5/lib/logstash/filters/grok.rb:264:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-2.0.5/lib/logstash/filters/grok.rb:259:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-2.0.5/lib/logstash/filters/grok.rb:255:in `register'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:182:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:182:in `start_workers'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:136:in `run'", "/Users/tjones/Documents/ElasticSearch/logstash-2.3.4/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}

Is something wrong within my grok filter? Or am I completely off base and should I approach this another way? I was looking at using custom patterns but would like to get the basics down first.

Thanks

How about using the Grok Constructor, if you haven't tried it yet?

http://grokconstructor.appspot.com/do/construction

That's where I started, with a slightly different grok site:

https://grokdebug.herokuapp.com/

It came out fine there.

Hi,

  • Try %{HOSTNAME} instead of %{HOST}; it works for me.
  • 'host' is already a field name (the machine where Logstash runs), and it's an array, so your value will be appended to it rather than replacing it. Use a different field name.
  • Be careful: you've got 2 spaces in your pattern between ',' and 'ReferencedHost', which will cause a _grokparsefailure.
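
Putting the first and third points together, the two fixes to the match string look like this (the field name syslog_host is just an illustrative choice to avoid colliding with the existing 'host' field):

```
# Fails to compile: %{HOST} is not a defined pattern in this grok version
%{SYSLOGTIMESTAMP:timestamp} %{HOST:host} %{WORD:blockedReason}:
# Compiles, and avoids appending to the built-in 'host' field
%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:syslog_host} %{WORD:blockedReason}:

# Doubled space causes _grokparsefailure even once the pattern compiles
HTTPReferer: %{URI:httpReferer},  ReferencedHost: %{DATA:referencedHost}
# Single space matches the actual log line
HTTPReferer: %{URI:httpReferer}, ReferencedHost: %{DATA:referencedHost}
```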

To test your grok pattern, go here: http://grokconstructor.appspot.com/

@timestamp does not seem to be filled from your 'timestamp' field: try a date {} filter.
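
A minimal sketch, assuming your grok captures the syslog timestamp into a field named 'timestamp' as in the config above (the second pattern handles single-digit days, which syslog pads with an extra space):

```
filter {
  date {
    # Parse the captured syslog timestamp and use it to set @timestamp;
    # syslog dates carry no year, so the current year is assumed.
    match => [ "timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
  }
}
```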