Invalid Field Reference Logstash Grok

I'm receiving the error below, but as far as I can tell it doesn't offer much information. Has anyone seen this before?

[2022-06-17T07:45:19,655][WARN ][logstash.filters.grok    ][Cisco_ASA][694b9ac8fb3e2aae9d4c91962e716b9f2c831ad9475f8f862e96cdd9d8e515a9] Grok regexp threw exception {:message=>"Invalid FieldReference: `[source][user][name`", :exception=>RuntimeError, :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:112:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:426:in `handle'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:386:in `block in match'", "(eval):21:in `block in compile_captures_func'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:202:in `capture'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:386:in `block in match'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:381:in `match'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:367:in `match_against_groks'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:357:in `match'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:301:in `block in filter'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.2/lib/logstash/filters/grok.rb:300:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178:in `block in multi_filter'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:300:in `block in start_workers'"]}

^^ This is the error I'm seeing over and over in the logs after upgrading from 7.14 to 7.17.4 to 8.3.2.

Here's my .conf file:

input {
        udp {
                port => 10514
                type => "cisco-fw"
        }
}

filter {

        if "ASA-6-434004" in [message] { drop { } }
        if "ASA-6-305012" in [message] { drop { } }


        # Extract fields from each of the detailed message types.
        # The patterns below are included in the core of Logstash 1.4.2.
        grok {
                match => [
                        "message", "%{CISCOFW106001}",
                        "message", "%{CISCOFW106006_106007_106010}",
                        "message", "%{CISCOFW106014}",
                        "message", "%{CISCOFW106015}",
                        "message", "%{CISCOFW106021}",
                        "message", "%{CISCOFW106023}",
                        "message", "%{CISCOFW106100}",
                        "message", "%{CISCOFW110002}",
                        "message", "%{CISCOFW302010}",
                        "message", "%{CISCOFW302013_302014_302015_302016}",
                        "message", "%{CISCOFW302020_302021}",
                        "message", "%{CISCOFW305011}",
                        "message", "%{CISCOFW313001_313004_313008}",
                        "message", "%{CISCOFW313005}",
                        "message", "%{CISCOFW402117}",
                        "message", "%{CISCOFW402119}",
                        "message", "%{CISCOFW419001}",
                        "message", "%{CISCOFW419002}",
                        "message", "%{CISCOFW500004}",
                        "message", "%{CISCOFW602303_602304}",
                        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
                        "message", "%{CISCOFW713172}",
                        "message", "%{CISCOFW733100}"
                ]
        }
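
(As an aside: the array form of match used above still works, but current Logstash documentation favors the hash form, which maps a field to a list of patterns. A sketch of the same block, trimmed to three of the patterns for brevity:)

```
        grok {
                match => {
                        "message" => [
                                "%{CISCOFW106001}",
                                "%{CISCOFW106006_106007_106010}",
                                "%{CISCOFW733100}"
                        ]
                }
        }
```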

        grok {
                patterns_dir => ["/opt/logstash/patterns"]
                match => [
                        "message", "%{CISCOFWSFR434002_SFR}",
#                       "message", "%{CISCOFWVPN113019_DAY}",
                        "message", "%{CISCOFWVPN113019_VPN}",
                        "message", "%{CISCOLOGDETAILS}"
                        ]
        }
        mutate {
                convert => {
                        "vpn_session_bytes_rcv" => "integer"
                        "vpn_session_bytes_xmt" => "integer"
                }
        }


    geoip {
      source => "src_ip"
      target => "src_geoip"
      database => "/home/xxxx/GeoLite2-City.mmdb"
      add_field => [ "[src_geoip][coordinates]", "%{[src_geoip][longitude]}" ]
      add_field => [ "[src_geoip][coordinates]", "%{[src_geoip][latitude]}"  ]
    }
    mutate {
      convert => [ "[src_geoip][coordinates]", "float"]
    }
    # do GeoIP lookup for the ASN/ISP information.
    geoip {
      database => "/home/xxxx/GeoLite2-ASN.mmdb"
      source => "src_ip"
      target => "src_ip_asn"
    }

    geoip {
      source => "dst_ip"
      target => "dst_geoip"
      database => "/home/xxxx/GeoLite2-City.mmdb"
      add_field => [ "[dst_geoip][coordinates]", "%{[dst_geoip][longitude]}" ]
      add_field => [ "[dst_geoip][coordinates]", "%{[dst_geoip][latitude]}"  ]
    }
    mutate {
      convert => [ "[dst_geoip][coordinates]", "float"]
    }
    # do GeoIP lookup for the ASN/ISP information.
    geoip {
      database => "/home/xxxx/GeoLite2-ASN.mmdb"
      source => "dst_ip"
      target => "dst_ip_asn"
    }
}

It seems the shipped pattern itself is wrong, so this is a bug rather than a problem with your config.

Looking into the patterns file, the culprit is the pattern for CISCOFW302013_302014_302015_302016; the line is:

CISCOFW302013_302014_302015_302016 %{CISCO_ACTION:[cisco][asa][outcome]}(?: %{CISCO_DIRECTION:[cisco][asa][network][direction]})? %{WORD:[cisco][asa][network][transport]} connection %{INT:[cisco][asa][connection_id]} for %{NOTSPACE:[observer][ingress][interface][name]}:%{IP:[source][ip]}/%{INT:[source][port]:int}(?: \(%{IP:[source][nat][ip]}/%{INT:[source][nat][port]:int}\))?(?:\(%{DATA:[source][user][name?]}\))? to %{NOTSPACE:[observer][egress][interface][name]}:%{IP:[destination][ip]}/%{INT:[destination][port]:int}( \(%{IP:[destination][nat][ip]}/%{INT:[destination][nat][port]:int}\))?(?:\(%{DATA:[destination][user][name]}\))?( duration %{TIME:[cisco][asa][duration]} bytes %{INT:[network][bytes]:int})?(?: %{CISCO_REASON:[event][reason]})?(?: \(%{DATA:[user][name]}\))?

Note the part meant to extract the field source.user.name below. The stray ? breaks the field reference, which is why the error reports the truncated, unbalanced `[source][user][name`:

(?:\(%{DATA:[source][user][name?]}\))

It should be

(?:\(%{DATA:[source][user][name]}\))

You can temporarily fix it by editing this line in the file:

/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.3.3/patterns/ecs-v1/firewalls

The version of the logstash-patterns-core gem may be different on your system.
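
If you'd rather script the temporary edit, a sed substitution like the one below does it. This is a sketch: it runs on a sample line fed via stdin; on a real install you would point sed -i with the same expression at the vendored firewalls patterns file at the path above (gem version varies), and the edit is lost again whenever the gem is reinstalled or upgraded.

```shell
# Replace the broken capture name [source][user][name?] with [source][user][name].
# Demonstrated on a sample line from stdin; on a real system, run
# `sed -i <same expression> <patterns file>` against the vendored
# ecs-v1/firewalls file instead.
printf '%s\n' '(?:\(%{DATA:[source][user][name?]}\))?' |
  sed 's/\[source\]\[user\]\[name?\]/[source][user][name]/'
```

The output should show the corrected capture, %{DATA:[source][user][name]}.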

I can open a pull request with the fix so Elastic can include it in the next version.


Yeah, I was able to replicate it in version 8.2.3 using a sample message for ASA-6-302016; the change I mentioned to the patterns file fixed it.
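
For anyone else reproducing this: a minimal throwaway pipeline (a sketch; the filename is arbitrary) lets you paste a single sample ASA-6-302016 line on stdin and immediately see whether the grok succeeds or the FieldReference warning fires:

```
input { stdin { } }
filter {
        grok {
                match => [ "message", "%{CISCOFW302013_302014_302015_302016}" ]
        }
}
output { stdout { codec => rubydebug } }
```

Run it with something like bin/logstash -f test.conf and paste the sample message; running once with --config.test_and_exit first is handy to syntax-check the file.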

The path for version 8.2.3 is:

$LOGSTASH_HOME/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.3.3/patterns/ecs-v1/firewalls

I'm opening an issue and a PR to fix this.


I will give it a try here and report back if it works for me.

It seems that someone else has already opened a GitHub issue for this.
