Logstash stopped processing

Hi,

Elasticsearch and Kibana are working as intended, but after installing and configuring Logstash on CentOS, the Logstash service stops:

[2022-11-23T15:20:38,842][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2022-11-23T15:20:38,852][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.1", "jruby.version"=>"jruby 9.3.8.0 (2.6.8) 2022-09-13 98d69c9461 OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-linux]"}
[2022-11-23T15:20:38,865][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-11-23T15:20:38,881][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2022-11-23T15:20:38,882][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x7536dc3b @directory="/usr/share/logstash/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2022-11-23T15:20:38,886][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2022-11-23T15:20:38,887][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5e4b92b3 @directory="/usr/share/logstash/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2022-11-23T15:20:38,912][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
    at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
    at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
    at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:91) ~[?:?]

Most likely not enough memory. Increase the heap to 2 or 4 GB and try again.
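On a package install the heap is set in /etc/logstash/jvm.options; a sketch (adjust the values to your host):

# /etc/logstash/jvm.options
-Xms4g
-Xmx4g

Then restart the service with systemctl restart logstash.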

The same error occurred even after increasing the heap size from -Xms1g / -Xmx1g to -Xms4g / -Xmx4g.

Can you check:

  • Is another LS instance running?
    ps -ef | grep logstash
  • Is there anything in the journal? Restart LS and check:
    journalctl -u logstash.service --since "30min ago"
  • Are there more messages in logstash-plain.log?
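    tail -n 200 /var/log/logstash/logstash-plain.log   # assuming the default path.logs of /var/log/logstash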

What is your configuration? Please share your configuration files.

Also, check the system log, /var/log/messages, for hints about why Logstash stops after you start the service.
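For example, something like:

grep -i logstash /var/log/messages | tail -n 50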

Only one Logstash instance is running:

ps -ef | grep logstash

logstash 6033 1 1 18:46 ? 00:00:00 /usr/share/logstash/jdk/bin/java -Xms4g -Xmx4g -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -Djruby.jit.threshold=0 -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -Dlog4j2.isThreadContextMapInheritable=true -Djruby.regexp.interruptible=true -Djdk.io.File.enableADS=true --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-opens=java.base/java.security=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.nio.channels=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED -cp /usr/share/logstash/vendor/jruby/lib/jruby.jar:/usr/share/logstash/logstash-core/lib/jars/checker-qual-3.12.0.jar:/usr/share/logstash/logstash-core/lib/jars/commons-codec-1.15.jar:/usr/share/logstash/logstash-core/lib/jars/commons-compiler-3.1.0.jar:/usr/share/logstash/logstash-core/lib/jars/commons-logging-1.2.jar:/usr/share/logstash/logstash-core/lib/jars/error_prone_annotations-2.11.0.jar:/usr/share/logstash/logstash-core/lib/jars/failureaccess-1.0.1.jar:/usr/share/logstash/logstash-core/lib/jars/google-java-format-1.15.0.jar:/usr/share/logstash/logstash-core/lib/jars/guava-31.1-jre.jar:/usr/share/logstash/logstash-core/lib/jars/httpclient-4.5.13.jar:/usr/share/logstash/logstash-core/lib/jars/httpcore-4.4.14.jar:/usr/share/logstash/logstash-core/lib/jars/j2objc-annotations-1.3.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-annotations-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-core-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-databind-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-dataformat-cbor-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-dataformat-yaml-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/janino-3.1.0.jar:/usr/share/logstash/logstash-core/lib/jars/javassist-3.29.0-GA.jar:/usr/share/logstash/logstash-core/lib/jars/jsr305-3.0.2.jar:/usr/share/logstash/logstash-core/lib/jars/jvm-options-parser-8.5.1.jar:/usr/share/logstash/logstash-core/lib/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-1.2-api-2.17.1.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-api-2.17.1.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-core-2.17.1.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-jcl-2.17.1.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-slf4j-impl-2.17.1.jar:/usr/share/logstash/logstash-core/lib/jars/logstash-core.jar:/usr/share/logstash/logstash-core/lib/jars/reflections-0.10.2.jar:/usr/share/logstash/logstash-core/lib/jars/slf4j-api-1.7.32.jar:/usr/share/logstash/logstash-core/lib/jars/snakeyaml-1.30.jar org.logstash.Logstash --path.settings /etc/logstash
root 6086 22656 0 18:46 pts/1 00:00:00 grep --color=auto logstash

less /etc/logstash/logstash.yml

path.data: /var/lib/logstash
log.level: debug
path.logs: /var/log/logstash
output.elasticsearch:
  hosts: ["https://elk.myvfirst.com:9200"]
  enabled: true
  protocol: "https"
  username: "elastic"
  password: "xxxxxx"
  ssl.enabled: true
  ssl.certificate_authorities: "/etc/apm-server/ca.crt"

==============================================

less /etc/logstash/pipeline.yml

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/filebeat-pipeline.conf"

===================================================
less /etc/logstash/conf.d/filebeat-pipeline.conf

input {
  beats {
    port => "5044"
    host => "0.0.0.0"
    ssl => true
    ssl_certificate_authorities => ["/etc/kibana/ca.crt"]
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["https://elk.myvfirst.com:9200"]
    user => "elastic"
    password => "xxxxxxx"
    data_stream => true
    ssl_certificate_authorities => ["/etc/kibana/ca.crt"]
  }
}

Please remove this from logstash.yml. output.elasticsearch is a Beats setting, not a valid logstash.yml option, and Logstash exits on startup when it finds unknown settings; the Elasticsearch output belongs in your pipeline config under conf.d instead:

output.elasticsearch:
  hosts: ["https://elk.myvfirst.com:9200"]
  enabled: true
  protocol: "https"
  username: "elastic"
  password: "xxxxxx"
  ssl.enabled: true
  ssl.certificate_authorities: "/etc/apm-server/ca.crt"
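After removing that block, a minimal logstash.yml only needs node-level settings, for example (a sketch based on the rest of your posted file):

# /etc/logstash/logstash.yml
path.data: /var/lib/logstash
path.logs: /var/log/logstash
log.level: debug

The Elasticsearch connection stays in /etc/logstash/conf.d/filebeat-pipeline.conf.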

Can you disable your filebeat-pipeline.conf and run a simple test.conf:

input {
  generator {
    message => "Test msg;"
    count => 1
  }
} # input

filter {
  mutate { add_field => { "ls-ecs" => "%{[host][name]}" } }
}

output {
  stdout {
    codec => rubydebug { metadata => true }
  }
}
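You can run it in the foreground, for example like this (assuming a package install and that the file is saved as /etc/logstash/test.conf, a path chosen here for illustration):

# run the test pipeline once; a single generator event should be printed by rubydebug
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/test.conf --path.settings /etc/logstash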

Thanks for your help. Removing output.elasticsearch: .... from logstash.yml did the trick; Logstash is running now. But can you help me further with Filebeat? I think it is not configured correctly.

Refer to the configuration files:
less /etc/filebeat/filebeat.yml

filebeat.inputs:
- type: log
  id: my-filestream-id
  enabled: true
  paths:
    - /opt/elk/log/elasticsearch/*.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

logging.level: debug
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0640

output.logstash:
  hosts: ["localhost:5044"]
  type: log
  enabled: true
  paths: /opt/elk/log/elasticsearch/*.json
  worker: 1
setup.kibana:
  hosts: ["localhost:5601"]

I have enabled the logstash module using filebeat modules enable logstash.
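The enabled modules can be confirmed with the standard Filebeat CLI:

filebeat modules list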

less {filebeat_home}/logstash/module.yml

dashboards:
- id: Filebeat-Logstash-Log-Dashboard
  file: Filebeat-logstash-log.json
- id: Filebeat-Logstash-Slowlog-Dashboard
  file: Filebeat-logstash-slowlog.json

- module: logstash
  log:
    enabled: true
    var.paths: ["/var/log/logstash/logstash.log*"]
  slowlog:
    enabled: true
    var.paths: ["/var/log/logstash/logstash-slowlog.log*"]

systemctl status filebeat

● filebeat.service - Filebeat sends log files to Logstash or directly to Elasticsearch.
   Loaded: loaded (/usr/lib/systemd/system/filebeat.service; disabled; vendor preset: disabled)
   Active: failed (Result: start-limit) since Thu 2022-11-24 12:56:08 IST; 18s ago
     Docs: https://www.elastic.co/beats/filebeat
  Process: 26855 ExecStart=/usr/share/filebeat/bin/filebeat --environment systemd $BEAT_LOG_OPTS $BEAT_CONFIG_OPTS $BEAT_PATH_OPTS (code=exited, status=1/FAILURE)
 Main PID: 26855 (code=exited, status=1/FAILURE)

Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: filebeat.service: main process exited, code=exited, status=1/FAILURE
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: Unit filebeat.service entered failed state.
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: filebeat.service failed.
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: filebeat.service holdoff time over, scheduling restart.
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: Stopped Filebeat sends log files to Logstash or directly to Elasticsearch..
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: start request repeated too quickly for filebeat.service
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: Failed to start Filebeat sends log files to Logstash or directly to Elasticsearch..
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: Unit filebeat.service entered failed state.
Nov 24 12:56:08 vl070073-app2-pd-a15-sms-aws-test-mum-in-vf.vfirst.local systemd[1]: filebeat.service failed.
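To surface the actual startup error, the configuration can be validated and the journal checked (standard Filebeat and systemd commands; paths assume the package install):

# validate /etc/filebeat/filebeat.yml and print any parse or setting error
sudo filebeat test config -c /etc/filebeat/filebeat.yml
# show what Filebeat logged before systemd stopped restarting it
journalctl -u filebeat.service --since "30min ago"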

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.