Grok & firewall logs

Hi,
I have some firewall logs sent to "syslog", and I have Filebeat read the syslog file, send it to Logstash, and then on to Elasticsearch.

With this filter (1):

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}" }
    }

the documents show up in Elasticsearch (as-is).

But with this filter (2):

    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }

the message part of the document is duplicated (or, as I'd say, appended twice into the message field). The stdout shows:

    message[0] => ..........
    message[1] => ........

So I just used filter (1).
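(The duplication happens because grok appends to a field that already exists instead of replacing it. A minimal sketch of filter (2) that avoids this, using grok's overwrite option to replace the message field:)

    grok {
      match     => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
      overwrite => [ "message" ]
    }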

This is my sample firewall log:

Sep 23 18:08:59 zzfwm01 <134>1 2019-09-23T18:08:59Z zzfwm01 CheckPoint 9293 - [action:"Accept"; flags:"18692"; ifdir:"inbound"; ifname:"eth1"; loguid:"{0x5d890a3b,0x12,0xfc5710ac,0xc0000001}";
origin:"192.168.87.252"; time:"1569262139"; version:"1"; __policy_id_tag:"product=VPN-1 & FireWall-1[db_tag={EF33D65F-CD17-7A46-86C0-517C141DCAAF};mgmt=zzfwm01;date=1568814820;policy_name=BOGOcity]";
dst:"123.124.125.126"; nat_addtnl_rulenum:"1"; nat_rulenum:"22"; origin_sic_name:"CN=zzfwi03,O=host103x.bogocity.com."; product:"VPN-1 & FireWall-1"; proto:"6"; rule:"415";
rule_uid:"{7DA9A9DA-194A-479D-B659-08AC56036FDF}"; s_port:"14346"; service:"443"; service_id:"https"; src:"212.213.214.215"; xlatedport:"12443"; xlatedst:"192.168.10.26"; xlatesport:"0"; xlatesrc:"0.0.0.0"; ]

How do I extract or reference the IP addresses individually? %{IPV4} always matches the first one, "192.168.87.252". I'd like to separately get dst:"123.124.125.126", nat_rulenum:"22", s_port:"14346", and src:"212.213.214.215".

I'd appreciate any help/tips.

thanks,
Sirjune

Well, that's just what you have the grok/dissect filter plugins for: you'll have to create patterns to dig out the individual info components of your log events and name each of them, ideally as many as possible according to the ECS schema.

See more here

    # Split off the syslog header: the first three tokens form the syslog timestamp,
    # then hostname, <pri>, the ISO8601 timestamp, a skipped header chunk, and
    # everything after the opening "[" as the key/value payload.
    dissect { mapping => { "message" => "%{[@metadata][timestamp1]} %{+[@metadata][timestamp1]} %{+[@metadata][timestamp1]} %{hostname} <%{pri}>1 %{[@metadata][timestamp2]} %{} [%{kvData}" } }
    # Strip the trailing " ]" that closes the key/value list.
    mutate { gsub => [ "kvData", " ]$", "" ] }
    # Parse the semicolon-separated key:"value" pairs into [kvStuff].
    kv { field_split => ";" value_split => ":" source => "kvData" target => "kvStuff" trim_key => " " trim_value => "{}" }
    # Use the ISO8601 timestamp from the log as the event @timestamp.
    date { match => [ "[@metadata][timestamp2]", "ISO8601" ] }

will get you

   "kvStuff" => {
                  "rule" => "415",
                "origin" => "192.168.87.252",
              "xlatesrc" => "0.0.0.0",
                   "dst" => "123.124.125.126",
    "nat_addtnl_rulenum" => "1",
           "nat_rulenum" => "22",
                  "time" => "1569262139",
                 "proto" => "6",
                 "ifdir" => "inbound",
                 "flags" => "18692",
       "origin_sic_name" => "CN=zzfwi03,O=host103x.bogocity.com.",
       "__policy_id_tag" => "product=VPN-1 & FireWall-1[db_tag={EF33D65F-CD17-7A46-86C0-517C141DCAAF};mgmt=zzfwm01;date=1568814820;policy_name=BOGOcity]",
                "s_port" => "14346",
            "xlatedport" => "12443",
                "loguid" => "0x5d890a3b,0x12,0xfc5710ac,0xc0000001",

etc.
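To get closer to the ECS naming mentioned above, one option is to rename the parsed keys with mutate; a sketch (the four fields below are just examples, not a complete mapping):

    mutate {
      rename => {
        "[kvStuff][src]"    => "[source][ip]"
        "[kvStuff][s_port]" => "[source][port]"
        "[kvStuff][dst]"    => "[destination][ip]"
        "[kvStuff][action]" => "[event][action]"
      }
    }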


Thank you @stefws and @Badger! I came across dissect/kv yesterday but got stuck with the kv delimiters I had to use. @Badger, thanks again, and I will try out the filter you noted.

@Badger, that worked like a charm! I replaced kvStuff with fwStuff. My end goal here is to use GeoIP. I added this in the filter section:

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}" }
      }

      dissect { mapping => { "message" => "%{[@metadata][timestamp1]} %{+[@metadata][timestamp1]} %{+[@metadata][timestamp1]} %{hostname} <%{pri}>1 %{[@metadata][timestamp2]} %{} [%{fwData}" } }
      mutate { gsub => [ "fwData", " ]$", "" ] }
      kv { field_split => ";" value_split => ":" source => "fwData" target => "fwStuff" trim_key => " " trim_value => "{}" }
      date { match => [ "[@metadata][timestamp2]", "ISO8601" ] }

      geoip {
        source => "src"
      }
    }

    output {
      ...
      index => "logstash-testfwlog"
      ...
    }

The document goes to ES, but I am getting "_geoip_lookup_failure". I also tried source => "fwStuff.src".

I saw a similar config where geoip was defined after dissect (https://z0z0.me/logstash-using-dissect-instead-of-grok-for-filtering/).

Note that target is optional; if you omit it then the fields will be created at the root level. Also, you might want to add

    remove_field => [ "fwData" ]

to the kv filter, so that if that field is parsed correctly it will get removed and you do not index a second copy of the fields.
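Put together, a sketch of the kv filter with both suggestions applied (no target, so fields such as src land at the root level where geoip's source => "src" can find them):

    kv {
      field_split  => ";"
      value_split  => ":"
      source       => "fwData"
      trim_key     => " "
      trim_value   => "{}"
      remove_field => [ "fwData" ]
    }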

This one worked! @Badger: I found this in some other posts of yours.

    geoip {
      source => "[fwStuff][src]"
    }
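(For reference, [fwStuff][src] is Logstash's field-reference syntax for the src key nested under fwStuff; a dotted name like fwStuff.src is treated as a literal top-level field name, which is why the earlier attempt failed.)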

Many thanks to all!
