Logstash not sending syslog to elasticsearch

Hello Elastic team.

I have been asking tons of questions and have been slowly making my way through the final part of my ELK SIEM installation. I am running a small environment with Elasticsearch, Logstash, and Kibana installed on separate RHEL servers. I have opened the ports so that each system can talk to the others. Where I have been stuck for some time is getting syslog messages passed to Elasticsearch so they can be viewed and assessed in Kibana.

My network team has set up the internal firewall and an NTP server to send their syslogs directly to the Logstash server over UDP port 514. I have forwarded port 514 to a higher port since I am a sudo user, not root. I have tested using tcpdump -n -vv host HOST_IP_ADD and can see that the syslogs are being sent and received on my Logstash server.
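For reference, one common way to do that kind of redirect on RHEL (5144 is an example target port; substitute whichever high port you actually chose):

```shell
# Redirect inbound syslog traffic from privileged port 514 to high port 5144,
# so Logstash can listen there without running as root.
sudo iptables -t nat -A PREROUTING -p udp --dport 514 -j REDIRECT --to-port 5144
sudo iptables -t nat -A PREROUTING -p tcp --dport 514 -j REDIRECT --to-port 5144

# firewalld equivalent on RHEL:
sudo firewall-cmd --permanent --add-forward-port=port=514:proto=udp:toport=5144
sudo firewall-cmd --reload
```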

If I go to either my Elasticsearch or Kibana server and run echo 'hello world' | nc Logstash_IP port, I can see the hello world in Kibana under the syslog index I created in my logstash.conf file. Below is my logstash.conf, which is located in the /etc/logstash directory.
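Worth noting: nc uses TCP by default, while the firewall and NTP server send over UDP, so a plain nc test only exercises the tcp input. A sketch of testing both paths (LOGSTASH_IP and 5144 are placeholders for the real address and forwarded port):

```shell
# Exercises the tcp input:
echo 'hello tcp' | nc LOGSTASH_IP 5144
# Exercises the udp input, which is the path the network devices actually use:
echo 'hello udp' | nc -u -w1 LOGSTASH_IP 5144
```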

input {
    udp {
        port => port#
        type => "syslog"
    }
    tcp {
        port => port#
        type => "syslog"
    }
}

output {
     elasticsearch {
          hosts => ["es_ip_addresss:port"]
          user => "elastic_user"
          password => "elastic_password"]
          index => "syslog"-%{+YYYY.MM.dd}"
    }
}

If I run a test of my logstash.conf file from /usr/share/logstash by running
bin/logstash --config.test_and_exit -f /etc/logstash/logstash.conf, it reports that the configuration is OK.

I am not sure what else I can do to make sure it will do what I intend: read the syslogs it is being sent, forward them to Elasticsearch, and then let me access them in Kibana.

Also, if I look in Kibana under Security --> Hosts and filter down to source.ip with the IP of the NTP server, I see that traffic is flowing into my Elasticsearch via Packetbeat.

Thank you for any and all assistance you can provide.

I have read through a lot of other people's questions regarding logstash.conf as well, and I notice that when they add their output they tend to use localhost:9200 instead of the Elasticsearch IP address. Is that correct?

As I am running separate systems for the three layers of ELK, would I not want to point the output at Elasticsearch directly?

Would this be correct?

output {
     elasticsearch {
            hosts => ["localhost:9200"]
            password => ["ES_Password"]
            index => 'syslog"-%{+YYYY.MM.dd}"
   }
}

If you change your output to stdout, do you see messages on your console?

output {
 stdout {}
}

Also, is this a typo or is it how you actually have it?

index => 'syslog"-%{+YYYY.MM.dd}"

Should be

index => "syslog-%{+YYYY.MM.dd}"

Disregard that last comment. It errors out when trying to connect to localhost. I am now replacing it with 0.0.0.0:9200 and trying that.

Hello Aaron;

I did not see anything. However, I did not have my stdout configured with empty curly brackets. I will try that.

index => 'syslog"-%{+YYYY.MM.dd}" was a typo. I do have double quotes around my syslog output string.

I am going to try the stdout {} and let you know.

Is this also a typo?

password => "elastic_password"]

There should be no ] at the end.

Yes, just a typo. I am unable to copy and paste, so I need to retype everything.

My Logstash server crashed; I am getting the owner of the server to bring it back up. It happened after adding hosts => ["localhost:9200"].

Aaron;

When the Logstash server is operating properly again and I make that change to the output in my logstash.conf file, what would I be looking for in terms of messages?

Logstash is up and running. The current logstash.conf output looks like the following:

output {
    stdout {}
}

Should I see anything? If so, what would it be, generically speaking?

You would be looking for output coming straight from your input.

The reason for doing this is that right now you have both an input and an output, and from what you are describing I can't tell which one is the problem. If you set the output to stdout and see messages, then we know the input is good and can move on to the output.
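As a sketch, that isolated-input test can be a whole pipeline by itself (5144 here is an assumed stand-in for whichever high port 514 was forwarded to):

```conf
input {
    udp {
        port => 5144
        type => "syslog"
    }
}
output {
    # rubydebug prints each event to the console in a readable form
    stdout { codec => rubydebug }
}
```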

So when I changed the output to stdout, I did not see any messages. Everything was the same as before.

It could be that if you are not running Logstash as root, you don't have access to that port.

Ports below 1024 are privileged, and only the root user can listen on them.
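A quick illustration of that restriction (5144 is an assumed example of a high port; on Linux the cutoff can be tuned via net.ipv4.ip_unprivileged_port_start, but 1024 is the default):

```python
import os
import socket

def can_bind(port: int) -> bool:
    """Attempt to bind a UDP socket to `port`; return True if the OS allows it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.bind(("0.0.0.0", port))
        return True
    except PermissionError:
        return False
    finally:
        sock.close()

# High (unprivileged) ports bind fine for any user.
print(can_bind(5144))
# Ports below 1024 require root (or CAP_NET_BIND_SERVICE) on Linux.
if os.geteuid() != 0:
    print(can_bind(514))
```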

That is correct, and as I mentioned in my original post, I forwarded syslog port 514 to a port above 1024.

I have reverted my logstash.conf file to the other output, pointing at the Elasticsearch IP with the user and password.

The issue is still with your input, and I would focus on that until you see messages using stdout. When you start Logstash you should see all the info, warn, and error messages. Can you post those? They should contain a clue about what's going wrong.
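Assuming the default paths of an RPM install, a sketch of capturing those startup messages:

```shell
# Run Logstash in the foreground so startup messages print to the console
# (/etc/logstash/logstash.conf matches the path used earlier in this thread):
/usr/share/logstash/bin/logstash -f /etc/logstash/logstash.conf

# Or, if it runs as a service, watch the default log file:
tail -f /var/log/logstash/logstash-plain.log | grep -E 'INFO|WARN|ERROR'
```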

I will post that.

As an aside, I just sent a hello world from my Kibana server to the Logstash server on the UDP syslog port; it went through and I can see it in my Kibana console under the Discover panel.

Aaron; see below for the output:

[2021-11-12T11:06:39,435][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_9b24f5a3-92c3-49a7-b7f7-d03071dcc1a0"
[2021-11-12T11:06:39,435][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2021-11-12T11:06:39,435][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2021-11-12T11:06:39,499][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "9623644cc67197977b3d2db2b5760f6c3eb65cf162ee2ea0d3cbea04090798e5"
[2021-11-12T11:06:39,500][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2021-11-12T11:06:39,501][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_9b24f5a3-92c3-49a7-b7f7-d03071dcc1a0", enable_metric=>true, metadata=>false>
[2021-11-12T11:06:39,501][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2021-11-12T11:06:39,531][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2021-11-12T11:06:39,578][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/logstash-sample.conf"], :thread=>"#<Thread:0x563854f9 run>"}
[2021-11-12T11:06:39,982][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-stdout{}|[file]/etc/logstash/logstash-sample.conf:37:5:```
stdout {}
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2021-11-12T11:06:39,983][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-stdout{}|[file]/etc/logstash/logstash-sample.conf:37:5:```
stdout {}
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2021-11-12T11:06:39,986][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-stdout{}|[file]/etc/logstash/logstash-sample.conf:37:5:```
stdout {}
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2021-11-12T11:06:39,988][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-stdout{}|[file]/etc/logstash/logstash-sample.conf:37:5:```
stdout {}
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2021-11-12T11:06:40,067][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.49}
[2021-11-12T11:06:40,120][DEBUG][io.netty.util.internal.logging.InternalLoggerFactory][main] Using SLF4J as the default logging framework
[2021-11-12T11:06:40,122][DEBUG][io.netty.channel.MultithreadEventLoopGroup][main] -Dio.netty.eventLoopThreads: 8
[2021-11-12T11:06:40,133][DEBUG][io.netty.util.internal.InternalThreadLocalMap][main] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
[2021-11-12T11:06:40,133][DEBUG][io.netty.util.internal.InternalThreadLocalMap][main] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
[2021-11-12T11:06:40,138][DEBUG][io.netty.channel.nio.NioEventLoop][main] -Dio.netty.noKeySetOptimization: false
[2021-11-12T11:06:40,138][DEBUG][io.netty.channel.nio.NioEventLoop][main] -Dio.netty.selectorAutoRebuildThreshold: 512
[2021-11-12T11:06:40,149][DEBUG][io.netty.util.internal.PlatformDependent0][main] -Dio.netty.noUnsafe: false
[2021-11-12T11:06:40,150][DEBUG][io.netty.util.internal.PlatformDependent0][main] Java version: 11
[2021-11-12T11:06:40,152][DEBUG][io.netty.util.internal.PlatformDependent0][main] sun.misc.Unsafe.theUnsafe: available
[2021-11-12T11:06:40,152][DEBUG][io.netty.util.internal.PlatformDependent0][main] sun.misc.Unsafe.copyMemory: available
[2021-11-12T11:06:40,153][DEBUG][io.netty.util.internal.PlatformDependent0][main] java.nio.Buffer.address: available
[2021-11-12T11:06:40,153][DEBUG][io.netty.util.internal.PlatformDependent0][main] direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
        at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31) ~[logstash-input-tcp-6.0.10.jar:?]
        at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:233) ~[logstash-input-tcp-6.0.10.jar:?]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:227) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:289) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:92) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.channel.nio.NioEventLoop.newTaskQueue0(NioEventLoop.java:279) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.channel.nio.NioEventLoop.newTaskQueue(NioEventLoop.java:150) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:146) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37) [logstash-input-tcp-6.0.10.jar:?]
        at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84) [logstash-input-tcp-6.0.10.jar:?]

More snippets of the log after the change to the output:

 at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_0_dot_10_minus_java.lib.logstash.inputs.tcp.RUBY$method$register$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.0.10-java/lib/logstash/inputs/tcp.rb:154) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_0_dot_10_minus_java.lib.logstash.inputs.tcp.RUBY$method$register$0$__VARARGS__(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.0.10-java/lib/logstash/inputs/tcp.rb) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$block$register_plugins$1(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:148) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.runtime.BlockBody.yield(BlockBody.java:106) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.runtime.Block.yield(Block.java:184) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.RubyArray.each(RubyArray.java:1809) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:386) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:311) [jruby-complete-9.2.16.0.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb) [jruby-complete-9.2.16.0.jar:?]
        at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.16.0.jar:?]

Aaron;
Here is the final snippet from the end of logstash-plain.log:

[2021-11-12T11:06:40,272][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-11-12T11:06:40,275][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@id = "plain_932e7631-af6e-493c-89f9-6a018104015f"
[2021-11-12T11:06:40,275][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@enable_metric = true
[2021-11-12T11:06:40,275][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2021-11-12T11:06:40,287][DEBUG][logstash.inputs.udp      ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] Starting UDP worker thread {:worker=>2}
[2021-11-12T11:06:40,289][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@id = "plain_932e7631-af6e-493c-89f9-6a018104015f"
[2021-11-12T11:06:40,290][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@enable_metric = true
[2021-11-12T11:06:40,290][DEBUG][logstash.codecs.plain    ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2021-11-12T11:06:40,300][INFO ][logstash.inputs.udp      ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] Starting UDP listener {:address=>"0.0.0.0:5144"}
[2021-11-12T11:06:40,330][INFO ][logstash.inputs.udp      ][main][f044f0a6df1315f16877c38bc38258d7ef1f3f8c664abc1c264c9ab18f7904fa] UDP listener started {:address=>"0.0.0.0:5144", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[2021-11-12T11:06:43,203][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-12T11:06:43,203][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-12T11:06:45,211][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2021-11-12T11:06:48,209][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-12T11:06:48,210][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-12T11:06:50,211][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2021-11-12T11:06:53,214][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-12T11:06:53,215][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-12T11:06:55,211][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2021-11-12T11:06:58,219][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-12T11:06:58,220][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-12T11:07:00,211][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2021-11-12T11:07:03,224][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-12T11:07:03,224][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}