Problems indexing events into Elasticsearch

Hello everyone,

It seems there are issues with indexing events into Elasticsearch. Here is the output of systemctl status logstash:

● logstash.service - logstash
     Loaded: loaded (/lib/systemd/system/logstash.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2024-05-08 17:11:17 +01; 48s ago
   Main PID: 7714 (java)
      Tasks: 81 (limit: 9386)
     Memory: 791.6M
        CPU: 2min 6.558s
     CGroup: /system.slice/logstash.service
             └─7714 /usr/share/logstash/jdk/bin/java -Xms1g -Xmx1g -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -Dlog4j2.isThreadContextMapInheritable=true -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000 -Dlogstash.jackson.stream-read-constraints.max-number-length=10000 -Djruby.regexp.interruptible=true -Djdk.io.File.enableADS=true --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-opens=java.bas>

May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,083][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"options"=>{"application_id"=>[13, 0, 0, 61], "application_description"=>"Real Time Protocol", "application_name"=>"rtp"}, "exporter"=>{"version"=>9, "uptime_millis"=>8279456, "address"=>"10.1.1.1:56303", "source_id"=>0, "timestamp"=>"2024-05-08T16:07:06.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "type"=>"netflow_options"}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05-08T16:11:30.630Z", "mod>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,083][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"uptime_millis"=>8279460, "address"=>"10.1.1.1:56303", "version"=>9, "timestamp"=>"2024-05-08T16:07:06.000Z", "source_id"=>0}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[3, 0, 0, 179], "application_description"=>"Border Gateway Protocol", "application_name"=>"bgp"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "kind"=>"event", "module"=>"netflow", >
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,078][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"version"=>9, "uptime_millis"=>8279456, "address"=>"10.1.1.1:56303", "timestamp"=>"2024-05-08T16:07:06.000Z", "source_id"=>0}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[13, 0, 0, 83], "application_description"=>"Skype Peer-to-Peer Internet Telephony Protocol", "application_name"=>"skype"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,084][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"version"=>9, "uptime_millis"=>8279460, "address"=>"10.1.1.1:56303", "source_id"=>0, "timestamp"=>"2024-05-08T16:07:06.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[1, 0, 0, 47], "application_description"=>"General Routing Encapsulation", "application_name"=>"gre"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05-08T16:11:30.6>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,086][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"options"=>{"application_id"=>[13, 0, 1, 169], "application_description"=>"VDOLive streaming video", "application_name"=>"vdolive"}, "exporter"=>{"uptime_millis"=>8279460, "address"=>"10.1.1.1:56303", "version"=>9, "source_id"=>0, "timestamp"=>"2024-05-08T16:07:06.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "type"=>"netflow_options"}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05-08T16:11:30.6>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,083][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"uptime_millis"=>8280462, "address"=>"10.1.1.1:56303", "version"=>9, "source_id"=>0, "timestamp"=>"2024-05-08T16:07:07.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[3, 0, 0, 79], "application_description"=>"Finger", "application_name"=>"finger"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05-08T16:11:31.641Z", "kind"=>"event>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,087][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"address"=>"10.1.1.1:56303", "version"=>9, "uptime_millis"=>8279460, "source_id"=>0, "timestamp"=>"2024-05-08T16:07:06.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[13, 0, 0, 113], "application_description"=>"telepresence-media stream", "application_name"=>"telepresence-media"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05->
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,089][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"address"=>"10.1.1.1:56303", "version"=>9, "uptime_millis"=>8279460, "source_id"=>0, "timestamp"=>"2024-05-08T16:07:06.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[1, 0, 0, 89], "application_description"=>"Open Shortest Path First", "application_name"=>"ospf"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "created"=>"2024-05-08T16:11:30.636Z">
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,089][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"type"=>"netflow_options", "exporter"=>{"version"=>9, "uptime_millis"=>8280462, "address"=>"10.1.1.1:56303", "source_id"=>0, "timestamp"=>"2024-05-08T16:07:07.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "options"=>{"application_id"=>[13, 0, 0, 12], "application_description"=>"CU-SeeMe desktop video conference", "application_name"=>"cuseeme"}}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "kind"=>"event", "module>
May 08 17:11:52 adnane-virtual-machine logstash[7714]: [2024-05-08T17:11:52,090][WARN ][logstash.outputs.elasticsearch][main][45d9e59ef8366f15f02073ad413193ea18fe5176dfda287b2cbd39e702745f2a] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-2024.05.08", :routing=>nil}, {"netflow"=>{"options"=>{"application_id"=>[3, 0, 0, 70], "application_description"=>"Internet Gopher protocol, online document management.", "application_name"=>"gopher"}, "exporter"=>{"address"=>"10.1.1.1:56303", "version"=>9, "uptime_millis"=>8280462, "source_id"=>0, "timestamp"=>"2024-05-08T16:07:07.000Z"}, "scope"=>{"octet_delta_count"=>167837953}, "type"=>"netflow_options"}, "event"=>{"dataset"=>"netflow.log", "action"=>"netflow_options", "category"=>["network"], "kind">

Here are my configuration files.

/etc/filebeat/modules.d/netflow.yml

# Module: netflow
# Docs: https://www.elastic.co/guide/en/beats/filebeat/8.12/filebeat-module-netflow.html

- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0
      netflow_port: 2055
      # internal_networks specifies which networks are considered internal or private
      # you can specify either a CIDR block or any of the special named ranges listed
      # at: https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html#condition-network
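      # For example (illustrative lines, not part of the original file),
      # using one of the special named ranges:
      # internal_networks:
      #   - private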

/etc/logstash/conf.d/netflow.conf

input {
  beats {
    port => 5044
    type => "netflow"
  }
}

output {
  if [type] == "netflow" {
    elasticsearch {
      hosts => ["0.0.0.0:9200"]
      index => "netflow-%{+YYYY.MM.dd}"
    }
  }
}

/etc/filebeat/filebeat.yml

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input-specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream
  # Unique ID among all inputs, an ID is required.
  id: my-filestream-id

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\lo

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  # Line filtering happens after the parsers pipeline. If you would like to filter lines
  # before parsers, use include_message parser.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  # Line filtering happens after the parsers pipeline. If you would like to filter lines
  # before parsers, use include_message parser.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #prospector.scanner.exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: true

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging


# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "0.0.0.0:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:


# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["0.0.0.0:9200"]
  # Performance preset - one of "balanced", "throughput", "scale",
  # "latency", or "custom".
  preset: balanced

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

output.logstash:
  # The Logstash hosts
  hosts: ["0.0.0.0:5044"]



# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

Hello,

You need to share the full Logstash error log; what you shared does not show the reason for the error.

Check the Logstash log file at /var/log/logstash/logstash-plain.log and share a sample of the errors.
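
For example, something like this will pull a recent sample (the line count is arbitrary):

sudo tail -n 50 /var/log/logstash/logstash-plain.log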

Set:

output.logstash:
  hosts: ["localhost:5044"]

and comment out or remove this line:

# preset: balanced

and change:

setup.kibana:
  host: "localhost:5601"

Hello @leandrojmp

Thank you very much for your message. I apologize for the earlier confusion; here is the complete Logstash log you requested:

[2024-05-11T19:07:41,700][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
[2024-05-11T19:07:48,861][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-05-11T19:07:49,866][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-05-11T19:07:49,873][INFO ][logstash.runner          ] Logstash shut down.
[2024-05-11T19:08:11,734][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-05-11T19:08:11,742][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.13.1", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
[2024-05-11T19:08:11,747][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-05-11T19:08:11,751][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-05-11T19:08:11,751][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-05-11T19:08:12,943][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-05-11T19:08:13,410][INFO ][org.reflections.Reflections] Reflections took 170 ms to scan 1 urls, producing 132 keys and 468 values
[2024-05-11T19:08:13,888][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-05-11T19:08:13,910][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//0.0.0.0:9200"]}
[2024-05-11T19:08:14,054][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://0.0.0.0:9200/]}}
[2024-05-11T19:08:14,198][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://0.0.0.0:9200/"}
[2024-05-11T19:08:14,200][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.13.2) {:es_version=>8}
[2024-05-11T19:08:14,200][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-05-11T19:08:14,262][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"netflow-test-%{+YYYY.MM.dd}"}
[2024-05-11T19:08:14,264][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-05-11T19:08:14,283][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-05-11T19:08:14,308][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/netflow.conf"], :thread=>"#<Thread:0x236a567c /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-05-11T19:08:15,282][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.97}
[2024-05-11T19:08:15,293][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2024-05-11T19:08:15,305][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-05-11T19:08:15,330][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-05-11T19:08:15,418][INFO ][org.logstash.beats.Server][main][bd3f1d21fbbc0af589f647c1040f22a581163d49e95e97d86e0927972c404f1f] Starting server on port: 5044
[2024-05-11T19:08:21,664][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "34.120.177.193"]}, "@timestamp"=>2024-05-11T18:08:22.000Z, "flow"=>{"id"=>"t21k1QaD8iQ", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>1, "source_transport_port"=>39856, "flow_start_sys_up_time"=>1288466, "type"=>"netflow_flow", "octet_delta_count"=>40, "destination_transport_port"=>443, "ip_class_of_service"=>0, "source_ipv4_address"=>"10.1.10.254", "sampler_id"=>0, "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1288466, "destination_ipv4_address"=>"34.120.177.193", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"version"=>9, "address"=>"10.1.1.1:49919", "source_id"=>0, "uptime_millis"=>1310363, "timestamp"=>"2024-05-11T18:08:22.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>16, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"34.120.177.193", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "version"=>"8.12.2", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "event"=>{"start"=>"2024-05-11T18:08:00.103Z", "duration"=>0, "action"=>"netflow_flow", "type"=>["connection"], "dataset"=>"netflow.log", "category"=>["network"], "module"=>"netflow", "created"=>"2024-05-11T18:08:03.196Z", "kind"=>"event", "end"=>"2024-05-11T18:08:00.103Z"}, "network"=>{"transport"=>"tcp", "packets"=>1, "community_id"=>"1:yxP7M7CDnLWmOnYuXnM5Oxuui9Q=", "iana_number"=>6, "bytes"=>40, "direction"=>"inbound"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"ip"=>"10.1.10.254", "packets"=>1, "port"=>39856, "locality"=>"internal", "bytes"=>40}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}
[2024-05-11T19:08:21,664][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "34.95.113.255"]}, "@timestamp"=>2024-05-11T18:08:22.000Z, "flow"=>{"id"=>"mJ6vBMl2hE4", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>2, "source_transport_port"=>41822, "flow_start_sys_up_time"=>1292321, "type"=>"netflow_flow", "octet_delta_count"=>119, "destination_transport_port"=>443, "ip_class_of_service"=>0, "sampler_id"=>0, "source_ipv4_address"=>"10.1.10.254", "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1292444, "destination_ipv4_address"=>"34.95.113.255", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"source_id"=>0, "version"=>9, "address"=>"10.1.1.1:49919", "uptime_millis"=>1310363, "timestamp"=>"2024-05-11T18:08:22.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>24, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"34.95.113.255", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"version"=>"8.12.2", "ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "event"=>{"start"=>"2024-05-11T18:08:03.958Z", "duration"=>123000000, "action"=>"netflow_flow", "category"=>["network"], "dataset"=>"netflow.log", "type"=>["connection"], "module"=>"netflow", "created"=>"2024-05-11T18:08:03.196Z", "end"=>"2024-05-11T18:08:04.081Z", "kind"=>"event"}, "network"=>{"transport"=>"tcp", "packets"=>2, "community_id"=>"1:iZKQadMhUg8olLX86OLMzwtdgoo=", "iana_number"=>6, "bytes"=>119, "direction"=>"inbound"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"packets"=>2, "ip"=>"10.1.10.254", "port"=>41822, "locality"=>"internal", "bytes"=>119}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}
[2024-05-11T19:08:21,664][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "34.120.177.193"]}, "@timestamp"=>2024-05-11T18:07:53.000Z, "flow"=>{"id"=>"t21k1QaD8iQ", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>1, "source_transport_port"=>39856, "flow_start_sys_up_time"=>1257795, "type"=>"netflow_flow", "octet_delta_count"=>40, "destination_transport_port"=>443, "ip_class_of_service"=>0, "sampler_id"=>0, "source_ipv4_address"=>"10.1.10.254", "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1257795, "destination_ipv4_address"=>"34.120.177.193", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"address"=>"10.1.1.1:49919", "source_id"=>0, "version"=>9, "uptime_millis"=>1281364, "timestamp"=>"2024-05-11T18:07:53.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>16, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"34.120.177.193", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"version"=>"8.12.2", "ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "network"=>{"transport"=>"tcp", "packets"=>1, "community_id"=>"1:yxP7M7CDnLWmOnYuXnM5Oxuui9Q=", "iana_number"=>6, "bytes"=>40, "direction"=>"inbound"}, "event"=>{"start"=>"2024-05-11T18:07:29.431Z", "duration"=>0, "action"=>"netflow_flow", "category"=>["network"], "type"=>["connection"], "dataset"=>"netflow.log", "module"=>"netflow", "created"=>"2024-05-11T18:07:34.111Z", "kind"=>"event", "end"=>"2024-05-11T18:07:29.431Z"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"packets"=>1, "ip"=>"10.1.10.254", "port"=>39856, "locality"=>"internal", "bytes"=>40}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}
[2024-05-11T19:08:21,665][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "34.107.243.93"]}, "@timestamp"=>2024-05-11T18:08:22.000Z, "flow"=>{"id"=>"5d5rtdUfu-I", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>1, "source_transport_port"=>41256, "flow_start_sys_up_time"=>1283614, "type"=>"netflow_flow", "octet_delta_count"=>68, "destination_transport_port"=>443, "ip_class_of_service"=>0, "source_ipv4_address"=>"10.1.10.254", "sampler_id"=>0, "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1283614, "destination_ipv4_address"=>"34.107.243.93", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"address"=>"10.1.1.1:49919", "source_id"=>0, "version"=>9, "uptime_millis"=>1310363, "timestamp"=>"2024-05-11T18:08:22.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>24, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"34.107.243.93", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"version"=>"8.12.2", "ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "event"=>{"start"=>"2024-05-11T18:07:55.251Z", "duration"=>0, "action"=>"netflow_flow", "category"=>["network"], "dataset"=>"netflow.log", "type"=>["connection"], "module"=>"netflow", "created"=>"2024-05-11T18:08:03.196Z", "kind"=>"event", "end"=>"2024-05-11T18:07:55.251Z"}, "network"=>{"transport"=>"tcp", "packets"=>1, "community_id"=>"1:a9EULhucw+l/TnLJXGCMYCguC20=", "iana_number"=>6, "bytes"=>68, "direction"=>"inbound"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"packets"=>1, "ip"=>"10.1.10.254", "port"=>41256, "bytes"=>68, "locality"=>"internal"}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}
[2024-05-11T19:08:21,665][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "151.101.2.217"]}, "@timestamp"=>2024-05-11T18:07:53.000Z, "flow"=>{"id"=>"m8dNDadLIP8", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>2, "source_transport_port"=>45976, "flow_start_sys_up_time"=>1255268, "type"=>"netflow_flow", "octet_delta_count"=>126, "destination_transport_port"=>443, "ip_class_of_service"=>0, "sampler_id"=>0, "source_ipv4_address"=>"10.1.10.254", "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1255355, "destination_ipv4_address"=>"151.101.2.217", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"source_id"=>0, "version"=>9, "address"=>"10.1.1.1:49919", "uptime_millis"=>1281364, "timestamp"=>"2024-05-11T18:07:53.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>24, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"151.101.2.217", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"version"=>"8.12.2", "ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "event"=>{"start"=>"2024-05-11T18:07:26.904Z", "duration"=>87000000, "action"=>"netflow_flow", "category"=>["network"], "dataset"=>"netflow.log", "type"=>["connection"], "module"=>"netflow", "created"=>"2024-05-11T18:07:34.111Z", "end"=>"2024-05-11T18:07:26.991Z", "kind"=>"event"}, "network"=>{"transport"=>"tcp", "community_id"=>"1:+FFMEg9I6xQi7Yun4KWGH1MOt5I=", "packets"=>2, "iana_number"=>6, "bytes"=>126, "direction"=>"inbound"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"ip"=>"10.1.10.254", "packets"=>2, "port"=>45976, "locality"=>"internal", "bytes"=>126}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}
[2024-05-11T19:08:21,667][WARN ][logstash.outputs.elasticsearch][main][f272e2516d7fddc8f4ce751d6466200f44fa68bed5e6eeee77c3303ab14a15ec] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"netflow-test-2024.05.11", :routing=>nil}, {"input"=>{"type"=>"netflow"}, "tags"=>["forwarded", "beats_input_raw_event"], "related"=>{"ip"=>["10.1.10.254", "151.101.2.217"]}, "@timestamp"=>2024-05-11T18:08:22.000Z, "flow"=>{"id"=>"m8dNDadLIP8", "locality"=>"external"}, "type"=>"netflow", "netflow"=>{"packet_delta_count"=>5, "source_transport_port"=>45976, "flow_start_sys_up_time"=>1308725, "type"=>"netflow_flow", "octet_delta_count"=>277, "destination_transport_port"=>443, "ip_class_of_service"=>0, "source_ipv4_address"=>"10.1.10.254", "sampler_id"=>0, "ingress_interface"=>1, "flow_direction"=>0, "flow_end_sys_up_time"=>1308807, "destination_ipv4_address"=>"151.101.2.217", "protocol_identifier"=>6, "bgp_source_as_number"=>0, "bgp_destination_as_number"=>0, "ip_next_hop_ipv4_address"=>"192.168.122.1", "class_id"=>0, "exporter"=>{"address"=>"10.1.1.1:49919", "source_id"=>0, "version"=>9, "uptime_millis"=>1310363, "timestamp"=>"2024-05-11T18:08:22.000Z"}, "source_ipv4_prefix_length"=>16, "destination_ipv4_prefix_length"=>0, "tcp_control_bits"=>29, "egress_interface"=>2}, "fileset"=>{"name"=>"log"}, "destination"=>{"ip"=>"151.101.2.217", "port"=>443, "locality"=>"external"}, "@version"=>"1", "agent"=>{"ephemeral_id"=>"59904c33-a69c-4b1f-95e9-e1e6f85ba341", "id"=>"d2d378e8-5a49-446c-9585-15c07c89ada9", "version"=>"8.12.2", "name"=>"adnane-virtual-machine", "type"=>"filebeat"}, "observer"=>{"ip"=>"10.1.1.1"}, "event"=>{"start"=>"2024-05-11T18:08:20.362Z", "duration"=>82000000, "action"=>"netflow_flow", "category"=>["network"], "type"=>["connection"], "dataset"=>"netflow.log", "module"=>"netflow", "created"=>"2024-05-11T18:08:03.196Z", "kind"=>"event", "end"=>"2024-05-11T18:08:20.444Z"}, "network"=>{"transport"=>"tcp", "packets"=>5, "community_id"=>"1:+FFMEg9I6xQi7Yun4KWGH1MOt5I=", "iana_number"=>6, "bytes"=>277, "direction"=>"inbound"}, "ecs"=>{"version"=>"1.12.0"}, "service"=>{"type"=>"netflow"}, "source"=>{"ip"=>"10.1.10.254", "packets"=>5, "port"=>45976, "locality"=>"internal", "bytes"=>277}}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"only write ops with an op_type of create are allowed in data streams", "caused_by"=>{"type"=>"index_not_found_exception", "reason"=>"no such index [netflow-test-2024.05.11]", "resource.type"=>"index_or_alias", "excluded_ds"=>"true", "resource.id"=>"netflow-test-2024.05.11", "index_uuid"=>"_na_", "index"=>"netflow-test-2024.05.11"}}}}}

/etc/logstash/conf.d/netflow.conf

input {
  beats {
    port => 5044
    type => "netflow"
  }
}

output {
  if [type] == "netflow" {
    elasticsearch {
      hosts => ["0.0.0.0:9200"]
      index => "netflow-test-%{+YYYY.MM.dd}"
    }
  }
}

I still don't see any data in Discover.

Hello @Rios,

Thank you for your response. I'm still seeing the same error, so I don't think localhost is the issue.

Which version are you using? Filebeat 8 and Logstash 8 use data streams by default, which may require some additional options in the output.

Try the following.
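
The exact snippet suggested at this point was not preserved in the thread; the sketch below shows the kind of change the error points to. Since "only write ops with an op_type of create are allowed in data streams", and the netflow-test-* name apparently matches a data stream template, the output could send documents with the create action (or use the plugin's data_stream options instead of index):

output {
  if [type] == "netflow" {
    elasticsearch {
      hosts => ["0.0.0.0:9200"]
      index => "netflow-test-%{+YYYY.MM.dd}"
      # data streams only accept writes with op_type "create"
      action => "create"
    }
  }
}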

Hello, thank you @leandrojmp!

Great, I no longer see the [WARN] Could not index event to Elasticsearch warnings.

adnane@adnane-virtual-machine:~$ sudo tail -n 26 /var/log/logstash/logstash-plain.log
[2024-05-12T03:19:31,741][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
[2024-05-12T03:19:38,322][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-05-12T03:19:38,998][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-05-12T03:19:39,008][INFO ][logstash.runner          ] Logstash shut down.
[2024-05-12T03:20:04,278][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-05-12T03:20:04,289][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.13.1", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
[2024-05-12T03:20:04,293][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-05-12T03:20:04,299][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-05-12T03:20:04,299][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-05-12T03:20:05,493][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-05-12T03:20:05,884][INFO ][org.reflections.Reflections] Reflections took 204 ms to scan 1 urls, producing 132 keys and 468 values
[2024-05-12T03:20:06,331][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-05-12T03:20:06,356][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//0.0.0.0:9200"]}
[2024-05-12T03:20:06,511][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://0.0.0.0:9200/]}}
[2024-05-12T03:20:06,666][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://0.0.0.0:9200/"}
[2024-05-12T03:20:06,667][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.13.2) {:es_version=>8}
[2024-05-12T03:20:06,667][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-05-12T03:20:06,683][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"netflow-test-index-%{+YYYY.MM.dd}"}
[2024-05-12T03:20:06,684][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-05-12T03:20:06,701][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-05-12T03:20:06,714][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/netflow.conf"], :thread=>"#<Thread:0x63edff14 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-05-12T03:20:07,560][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.84}
[2024-05-12T03:20:07,566][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2024-05-12T03:20:07,575][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-05-12T03:20:07,596][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-05-12T03:20:07,678][INFO ][org.logstash.beats.Server][main][49746e342d608dd8f44b8e73d6745c07e83acabcfa6c0a8b0d7f22f521c7e4f6] Starting server on port: 5044

/etc/filebeat/modules.d/netflow.yml


It turns out that I now receive the data under the index I created as a data stream.

I think everything is working well now; I don't have any problems anymore.