pfSense Suricata alerts fail to process in Elasticsearch via the Filebeat module

Using Suricata 4.1.5 (EVE JSON) on pfSense, shipped via Redis -> file -> Filebeat -> Logstash -> Elasticsearch.
Alerts and some other event types are not showing up in the Filebeat index. Logstash is still 7.3.2 in this stack due to issues I had with its Netflow support, but since moving to the Filebeat netflow module I can now upgrade it without impact if required. Elasticsearch logs a number of other errors as well, all very similar to the example below but for other suricata-prefixed field names.

Platform: CentOS 7
logstash 7.3.2
filebeat 7.4.1
elasticsearch 7.4.1

Filebeat 7.4.1 modules.d/suricata.yml:

- module: suricata
  enabled: true
  var.paths: ["/var/log/suricata/suricata-*.json"]
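
For comparison, the stock modules.d/suricata.yml that ships with Filebeat 7.x nests these settings under the module's eve fileset rather than at the module level; a sketch (the path is carried over from my config above, adjust as needed):

```yaml
- module: suricata
  # The suricata module has a single fileset, "eve"; settings live under it.
  eve:
    enabled: true
    # Point this at the eve JSON files being written from Redis.
    var.paths: ["/var/log/suricata/suricata-*.json"]
```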

Sample JSON:

{"@version":"1","timestamp":"2019-10-25T17:02:16.984898-0700","host":"pfsense-hostname","alert":{"signature":"ET DROP Dshield Block Listed Source group 1","category":"Misc Attack","rev":5342,"severity":2,"action":"allowed","signature_id":2402000,"metadata":{"tag":["Dshield"],"updated_at":["2019_10_24"],"affected_product":["Any"],"deployment":["Perimeter"],"created_at":["2010_12_30"],"attack_target":["Any"],"signature_severity":["Major"]},"gid":1},"src_port":44952,"flow":{"pkts_toclient":0,"bytes_toserver":60,"bytes_toclient":0,"start":"2019-10-25T17:02:16.984898-0700","pkts_toserver":1},"src_ip":"","dest_port":8140,"proto":"TCP","flow_id":214746435946306,"in_iface":"em3","dest_ip":"","metadata":{"flowbits":["ET.Evil","ET.DshieldIP"]},"event_type":"alert","@timestamp":"2019-10-26T00:02:17.294Z"}

A few other event types are processed (e.g. tls, dns, dhcp), but I don't see alerts, http, or flow events at all. I am getting errors in the Elasticsearch log as follows; keep in mind I imported the index template (Filebeat 7.4.1) and I am using an ingest pipeline to add GeoIP data.

[2019-10-26T03:12:24,560][DEBUG][o.e.a.b.TransportShardBulkAction] [elastichostname] [filebeat-7.4.1-2019.10.26][0] failed to execute bulk item (index) index {[filebeat-7.4.1-2019.10.26][_doc][8FmNB24BaZCyclMagY1t], source[n/a, actual length: [2.3kb], max length: 2kb]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
Caused by: java.lang.IllegalArgumentException: Cannot write to a field alias [suricata.eve.alert.severity].
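
My reading of the error: the Filebeat ingest pipeline normally moves suricata.eve.alert.severity into event.severity before indexing, and the template only keeps suricata.eve.alert.severity as a query-time alias. If the pipeline is skipped, the document arrives with a real value at the alias path, which Elasticsearch rejects. A rough sketch of the rename as I understand it (field names are from the template snippets below; the real pipeline does much more than this):

```python
import json

# Sketch of what I believe the suricata ingest pipeline does for this
# field: move the concrete value out from under the alias path
# (suricata.eve.alert.severity) into the real field (event.severity).
doc = {"suricata": {"eve": {"event_type": "alert", "alert": {"severity": 2}}}}

severity = doc["suricata"]["eve"]["alert"].pop("severity", None)
if severity is not None:
    doc.setdefault("event", {})["severity"] = severity

print(json.dumps(doc, sort_keys=True))
# -> {"event": {"severity": 2}, "suricata": {"eve": {"alert": {}, "event_type": "alert"}}}
```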

Snip of Alias from template:

        "suricata" : {
          "properties" : {
            "eve" : {
              "properties" : {
                "alert" : {
                  "properties" : {
                    "severity" : {
                      "path" : "event.severity",
                      "type" : "alias"

Snip of event.severity definition from template:

        "event" : {
          "properties" : {
            "severity" : {
              "type" : "long"

I'm quite confused. I have been digging through the configuration at every layer, and I'm left thinking this is either something wrong with how Elasticsearch handles the alias or a Filebeat decoding issue.

Eliminating Logstash and sending directly to Elasticsearch (along with adding the Kibana connection) seems to work, but I don't really understand why that matters.
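
If the reason direct-to-Elasticsearch works is that Filebeat invokes its own ingest pipelines on that path, then when going through Logstash I presumably need to forward the pipeline name myself. A sketch of the Logstash elasticsearch output I'm considering (host and index pattern are placeholders; my assumption is that Filebeat puts the module's pipeline name in @metadata):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder host
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Forward the ingest pipeline name Filebeat puts in @metadata so
    # Elasticsearch still runs the suricata module's pipeline.
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```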
