I have ELK running in my podman-compose setup, and any logs I push to Logstash make it through to Elasticsearch just fine. The problem is when I also try to send to syslog, for those who already have a process to pull all syslog data: my setup just does not work, and nothing is ever written to syslog. The plugin does show up when I run Logstash's plugin listing command.
I separated out the configuration below, and I never get anything to syslog at all on my RHEL 8.7 machine. A plain "logger xxxxx" type of command from a terminal window sends messages to syslog fine. The TCP port is not blocked by firewalld, and the syslog conf file already listens on that port.
I have tried 514 both as a bare number and wrapped in "", with no luck. After five hours I am at a loss as to what I am doing wrong, so I am reaching out here for any suggestions, nuances, etc. The "192.168.x.x" is the IP of the host running rsyslog that should capture the logs.
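For context, the TCP listener on the rsyslog side is enabled with the stock imtcp input, something along these lines (a sketch from memory, not my exact file):

```
# /etc/rsyslog.conf (excerpt) -- load the TCP input module and listen on 514
module(load="imtcp")
input(type="imtcp" port="514")
```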
input {
  http {
    port => "5000"
  }
}
output {
  syslog {
    host => "192.168.13.xxx"
    port => 514
    protocol => "tcp"
    ssl_verify => "false"
  }
}
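To rule out basic connectivity and framing independent of Logstash, a minimal TCP sender can be pointed at the same host and port as the syslog output above (a sketch; the host below is the placeholder from my config, and the tag "logstash-test" is just an illustrative name):

```python
import socket

def send_syslog_tcp(host: str, port: int, message: str, timeout: float = 5.0) -> bytes:
    """Send one RFC 3164-style line to a TCP syslog listener; return the bytes sent."""
    # <13> = facility user (1) * 8 + severity notice (5); newline-framed for plain TCP
    line = f"<13>logstash-test: {message}\n".encode()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(line)
    return line

if __name__ == "__main__":
    # Same target as the syslog output block above
    send_syslog_tcp("192.168.13.xxx", 514, "hello from the logstash host")
```

If this lands in the target's syslog but the Logstash output does not, the network path is fine and the problem is on the Logstash side.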
Below is the output from the Logstash container after I stand up my compose YML with all the pieces:
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2022-12-20T21:55:43,359][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2022-12-20T21:55:43,385][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-linux]"}
[2022-12-20T21:55:43,391][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-20T21:55:43,462][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2022-12-20T21:55:43,496][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2022-12-20T21:55:43,953][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-12-20T21:55:44,005][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"752f6716-0499-486e-8e18-d4de20639068", :path=>"/usr/share/logstash/data/uuid"}
[2022-12-20T21:55:46,039][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-12-20T21:55:46,951][INFO ][org.reflections.Reflections] Reflections took 150 ms to scan 1 urls, producing 125 keys and 438 values
[2022-12-20T21:55:47,665][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2022-12-20T21:55:47,889][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/config/logstash.conf"], :thread=>"#<Thread:0x581cbf5e run>"}
[2022-12-20T21:55:48,670][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.77}
[2022-12-20T21:55:48,768][INFO ][logstash.codecs.json ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-12-20T21:55:48,934][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-20T21:55:48,952][INFO ][logstash.inputs.http ][main][a762021a90dbee39a30ff192cc8a3d1d076e61fe7a6b267a6ccd6945a3f50433] Starting http input listener {:address=>"0.0.0.0:5000", :ssl=>"false"}
[2022-12-20T21:55:49,055][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}