I tried using ports 2055, 2056, and 9995.
Before launching Logstash from the command line, I stopped the Logstash service.
I have now changed the port and IP address in the netflow module config:
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: "127.34.4.19"
      netflow_port: 2055
      netflow_timeout: 300s
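To double-check that the port is actually free before relaunching, something like this can be run (assuming ss and lsof are installed; the exact output will vary):

systemctl status logstash          # confirm the Logstash service is really stopped
sudo ss -lunp | grep -w 2055       # list UDP listeners bound to port 2055
sudo lsof -i UDP:2055              # show which process, if any, still holds the port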
but I still get the same messages in the logs:
service logstash stop
Redirecting to /bin/systemctl stop logstash.service
[root@elk ~]# sudo -u logstash /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/netflow.conf
Using bundled JDK: /usr/share/logstash/jdk
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2024-03-28 10:48:07.095 [main] runner - Starting Logstash {"logstash.version"=>"8.12.2", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
[INFO ] 2024-03-28 10:48:07.100 [main] runner - JVM bootstrap flags: [-XX:+HeapDumpOnOutOfMemoryError, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, -Djruby.regexp.interruptible=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11, -Dlog4j2.isThreadContextMapInheritable=true, -Xms1g, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Djdk.io.File.enableADS=true, -Dfile.encoding=UTF-8, --add-opens=java.base/java.io=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, -Djruby.compile.invokedynamic=true, -Xmx1g, -Djava.security.egd=file:/dev/urandom, -Djava.awt.headless=true, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED]
[INFO ] 2024-03-28 10:48:07.114 [main] runner - Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[INFO ] 2024-03-28 10:48:07.114 [main] runner - Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[WARN ] 2024-03-28 10:48:08.135 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2024-03-28 10:48:11.841 [Converge PipelineAction::Create<main>] Reflections - Reflections took 302 ms to scan 1 urls, producing 132 keys and 468 values
[INFO ] 2024-03-28 10:48:11.844 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/bindata-2.4.15/lib/bindata/base.rb:80: warning: previous definition of initialize was here
[INFO ] 2024-03-28 10:48:13.797 [Converge PipelineAction::Create<main>] javapipeline - Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[INFO ] 2024-03-28 10:48:13.804 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.0.125.7:9200"]}
[INFO ] 2024-03-28 10:48:14.654 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@10.0.125.7:9200/]}}
[INFO ] 2024-03-28 10:48:14.825 [[main]-pipeline-manager] elasticsearch - Failed to perform request {:message=>"Connect to 10.0.125.7:9200 [/10.0.125.7] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to 10.0.125.7:9200 [/10.0.125.7] failed: Connection refused>}
[WARN ] 2024-03-28 10:48:14.825 [[main]-pipeline-manager] elasticsearch - Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@10.0.125.7:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://10.0.125.7:9200/][Manticore::SocketException] Connect to 10.0.125.7:9200 [/10.0.125.7] failed: Connection refused"}
[INFO ] 2024-03-28 10:48:14.832 [[main]-pipeline-manager] elasticsearch - Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"netflow-%{+YYYY.MM.dd}"}
[INFO ] 2024-03-28 10:48:14.832 [[main]-pipeline-manager] elasticsearch - Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ] 2024-03-28 10:48:14.842 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/netflow.conf"], :thread=>"#<Thread:0x34265544 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[INFO ] 2024-03-28 10:48:18.457 [[main]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>3.61}
[INFO ] 2024-03-28 10:48:18.460 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2024-03-28 10:48:18.465 [[main]<udp] udp - Starting UDP listener {:address=>"0.0.0.0:2055"}
[ERROR] 2024-03-28 10:48:18.489 [[main]<udp] udp - UDP listener died {:exception=>#<Errno::EADDRINUSE: Address already in use - bind(2) for "0.0.0.0" port 2055>, :backtrace=>["org/jruby/ext/socket/RubyUDPSocket.java:201:in `bind'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-udp-3.5.0/lib/logstash/inputs/udp.rb:129:in `udp_listener'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-udp-3.5.0/lib/logstash/inputs/udp.rb:81:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:414:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405:in `block in start_input'"]}
[INFO ] 2024-03-28 10:48:18.542 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2024-03-28 10:48:19.832 [Ruby-0-Thread-9: /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.22.2-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:235] elasticsearch - Failed to perform request {:message=>"Connect to 10.0.125.7:9200 [/10.0.125.7] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to 10.0.125.7:9200 [/10.0.125.7] failed: Connection refused>}
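For context, the pipeline in /etc/logstash/conf.d/netflow.conf boils down to something like the following. This is a simplified sketch pieced together from the log lines above, not the exact file: the codec line is an assumption, and the password is a placeholder as masked in the logs.

input {
  udp {
    port => 2055                    # the listener that fails with EADDRINUSE on 0.0.0.0:2055
    codec => netflow                # assumption: netflow codec; the actual codec/filters may differ
  }
}

output {
  elasticsearch {
    hosts => ["https://10.0.125.7:9200"]
    index => "netflow-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "xxxxxx"            # placeholder
  }
}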