Hi. I'm running Elastic Stack 7.4 and using the grok filter in Logstash to extract data from a Cisco Firepower connection event. The syslog data is sent to Filebeat on a separate computer, where the Filebeat Cisco module processes it. My filter successfully matches the event code and writes it to the document as the field event.code. At present I'm only trying to match one other field, SrcPort, and write it to the records as client.port. There are no errors in the Logstash pipeline, but the second match never writes anything to its field. I've refreshed the index pattern in Kibana and can see the field listed there.
The Kibana Grok Debugger has no issues with either of the match patterns and successfully extracts the required data from the message.
My filter is:
filter {
  grok {
    patterns_dir => [ "E:/ProgramFiles/logstash/config/patterns" ]
    match => {
      "message" => [
        "<(?<event.code>[^>]+)>",
        "SrcPort:[\s]*(?<client.port>[0-9]{1,5}),"
      ]
    }
  }
}
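For what it's worth, I also checked the raw SrcPort regex outside Logstash. This quick Python sketch (capture group renamed to client_port, since Python group names can't contain a dot) matches an abridged copy of the sample message:

```python
import re

# Abridged copy of the sample Firepower message (full message below)
message = ("<118>Feb 6 16:22:14 fw-02-a web: Protocol: TCP, "
           "SrcIP: 192.168.0.1, SrcPort: 60756, DstPort: 80, TCPFlags: 0x0")

# Same regex as the grok match, except the capture group is named
# client_port because Python group names cannot contain a dot
pattern = r"SrcPort:[\s]*(?P<client_port>[0-9]{1,5}),"

m = re.search(pattern, message)
print(m.group("client_port"))  # -> 60756
```

So the pattern itself finds the port; the problem seems to be on the Logstash side.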
An example syslog message is:
<118>Feb 6 16:22:14 fw-02-a web: Protocol: TCP, SrcIP: 192.168.0.1, OriginalClientIP: ::, DstIP: 52.184.92.48, SrcPort: 60756, DstPort: 80, TCPFlags: 0x0, IngressZone: Inside, EgressZone: Outside, DE: Primary Detection Engine (8ea39c90-7915-11e8-a1eb-cd0d65f0cc7a), Policy: policyname, ConnectType: End, AccessControlRuleName: inside_internet, AccessControlRuleAction: Allow, Prefilter Policy: DHCP & Terredo, UserName: username, UserAgent: MICROSOFT_DEVICE_METADATA_RETRIEVAL_CLIENT, Client: Web browser, ApplicationProtocol: HTTP, WebApplication: Microsoft, InitiatorPackets: 80, ResponderPackets: 69, InitiatorBytes: 43730, ResponderBytes: 50138, NAPPolicy: Balanced Security and Connectivity, DNSResponseType: No Error, Sinkhole: Unknown, ReferencedHost: dmd.metaservices.microsoft.com, URLCategory: Unknown, URLReputation: Risk unknown, URL: http://dmd.metaservices.microsoft.com/metadata.svc
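In case it helps, the body of the message after the syslog header is just comma-separated "Key: value" pairs, so a naive split recovers each field. A quick Python sketch over an abridged copy of the message (it would break if a value ever contained ", ", but none do in this sample):

```python
# Abridged body of the sample message; keys and values copied verbatim
body = ("Protocol: TCP, SrcIP: 192.168.0.1, DstIP: 52.184.92.48, "
        "SrcPort: 60756, DstPort: 80, AccessControlRuleAction: Allow")

# Naive parse: split on ", " between pairs, then on the first ": " in each pair
fields = {}
for pair in body.split(", "):
    key, _, value = pair.partition(": ")
    fields[key] = value

print(fields["SrcPort"])  # -> 60756
```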
Any help will be appreciated. Thanks.