NetFlow from Cisco IOSv: incomplete data in Elasticsearch

Hello everyone,

I'm currently setting up NetFlow data collection on my machine, but I'm encountering some issues. I'm able to receive data in Filebeat, but I'm not getting all the NetFlow information I expect.

Here is an example of the NetFlow / IPFIX records collected:

{
  "_index": ".ds-filebeat-8.13.2-2024.04.19-000001",
  "_id": "ts50Mo8BDyQuREKbAF--",
  "_version": 1,
  "_score": null,
  "fields": {
    "netflow.type": [
      "netflow_options"
    ],
    "netflow.scope.octet_delta_count": [
      167837953
    ],
    "event.category": [
      "network"
    ],
    "service.type": [
      "netflow"
    ],
    "agent.type": [
      "filebeat"
    ],
    "netflow.exporter.source_id": [
      0
    ],
    "netflow.exporter.version": [
      9
    ],
    "event.module": [
      "netflow"
    ],
    "netflow.exporter.address": [
      "10.1.1.1:58427"
    ],
    "agent.name": [
      "adnane-virtual-machine"
    ],
    "network.direction": [
      "unknown"
    ],
    "event.kind": [
      "event"
    ],
    "netflow.exporter.uptime_millis": [
      15810024
    ],
    "netflow.exporter.timestamp": [
      "2024-05-01T04:39:34.000Z"
    ],
    "fileset.name": [
      "log"
    ],
    "input.type": [
      "netflow"
    ],
    "netflow.options.application_id": [
      3,
      0,
      0,
      70
    ],
    "agent.hostname": [
      "adnane-virtual-machine"
    ],
    "tags": [
      "forwarded"
    ],
    "netflow.options.application_name": [
      "gopher"
    ],
    "event.action": [
      "netflow_options"
    ],
    "event.ingested": [
      "2024-05-01T04:40:14.269Z"
    ],
    "@timestamp": [
      "2024-05-01T04:39:34.000Z"
    ],
    "agent.id": [
      "85dd7f5c-ead8-47a6-b3b8-46bf1f09b174"
    ],
    "ecs.version": [
      "1.12.0"
    ],
    "event.created": [
      "2024-05-01T04:40:05.283Z"
    ],
    "netflow.options.application_description": [
      "Internet Gopher protocol, online document management."
    ],
    "agent.ephemeral_id": [
      "b7156540-7d82-427e-90af-c550ce8d7ca5"
    ],
    "agent.version": [
      "8.13.2"
    ],
    "event.dataset": [
      "netflow.log"
    ],
    "observer.ip": [
      "10.1.1.1"
    ]
  },
  "sort": [
    "2024-05-01T04:39:34.000Z",
    93
  ]
}

Hi @Compte_Personnel

Please provide your filebeat config and the steps you took to enable the functionality.

What version are you on?

And did you look at the documentation to make sure you are using a supported configuration?

And finally, if you are using Filebeat 8.13, I think there is a bug with the NetFlow input. Try downgrading just Filebeat to 8.12: uninstall, reinstall, configure, run setup, and report back.
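Roughly, that downgrade on Ubuntu looks like the following; this is a sketch assuming the Elastic APT repository is already configured, and the 8.12.2 patch version is just an example:

sudo systemctl stop filebeat
sudo apt-get remove filebeat
sudo apt-get install filebeat=8.12.2   # pin an 8.12.x release
# confirm filebeat.yml and modules.d/netflow.yml are still in place, then:
sudo filebeat setup -e
sudo systemctl start filebeat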


Thank you for your answer, @stephenb.

I'm using Filebeat 8.13.2, installed on Ubuntu 22.04.4 LTS, to send NetFlow data to Elastic Stack 8.13.2.

Some data does come through, but fields such as source.ip and destination.ip are missing, so this may be the bug in the new version that you mentioned?
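For what it's worth, the record shown above has netflow.type: netflow_options, i.e. exporter/application metadata rather than an actual flow. A quick way to check whether any flow records arrived at all is a count query along these lines; the netflow_flow value and the credentials here are assumptions on my side:

curl -s -u elastic:changeme \
  "http://localhost:9200/filebeat-*/_count?q=netflow.type:netflow_flow&pretty"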

/etc/filebeat/modules.d/netflow.yml

# Module: netflow
# Docs: https://www.elastic.co/guide/en/beats/filebeat/8.13/filebeat-module-netflow.html

- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0
      netflow_port: 2055
      # internal_networks specifies which networks are considered internal or private
      # you can specify either a CIDR block or any of the special named ranges listed
      # at: https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html#condition-network
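The internal_networks variable mentioned in the comment is left unset here; per the linked docs it controls how network.direction is derived, which would explain the "unknown" value in the record above. A minimal sketch of the same module config with it set to the named range private:

- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0
      netflow_port: 2055
      # treat RFC 1918 ranges as internal so network.direction can be classified
      internal_networks:
        - private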
      

/etc/filebeat/filebeat.yml

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input-specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream
  # Unique ID among all inputs, an ID is required.
  id: my-filestream-id

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\lo

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  # Line filtering happens after the parsers pipeline. If you would like to filter lines
  # before parsers, use include_message parser.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  # Line filtering happens after the parsers pipeline. If you would like to filter lines
  # before parsers, use include_message parser.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #prospector.scanner.exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: true

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging


# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "0.0.0.0:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:


# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["0.0.0.0:9200"]
  # Performance preset - one of "balanced", "throughput", "scale",
  # "latency", or "custom".
  preset: balanced

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
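Before restarting the service, the config can be sanity-checked with Filebeat's built-in test subcommands:

sudo filebeat test config    # verify the configuration parses
sudo filebeat test output    # verify the Elasticsearch output is reachable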

Try Filebeat 8.12.x.
Make sure to configure it and run setup.
You can leave Elasticsearch on 8.13.x.


Thank you so much, Mr. Stephen Brown, for your assistance. I truly appreciate your help and expertise!
