Kibana not showing logs

I configured a basic ELK setup, but my Kibana interface shows no logs.
This is the guide I followed.
What mistake have I made?

Installing and Configuring Elasticsearch

curl -fsSL https://artifacts.elastic.co/GPG-KEY-elasticsearch |sudo gpg --dearmor -o /usr/share/keyrings/elastic.gpg

echo "deb [signed-by=/usr/share/keyrings/elastic.gpg] https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list

sudo apt update

sudo apt install elasticsearch

sudo nano /etc/elasticsearch/elasticsearch.yml

--- Network section ---
network.host: localhost
http.port: 9200        (uncomment this line by removing the leading '#')

--- Discovery ---
discovery.type: single-node

Save the config file and exit.

sudo systemctl start elasticsearch

sudo systemctl enable elasticsearch

curl -X GET "localhost:9200"

Output
{
"name" : "Elasticsearch",
"cluster_name" : "elasticsearch",
"cluster_uuid" : "n8Qu5CjWSmyIXBzRXK-j4A",
"version" : {
"number" : "7.17.2",
"build_flavor" : "default",
"build_type" : "deb",
"build_hash" : "de7261de50d90919ae53b0eff9413fd7e5307301",
"build_date" : "2022-03-28T15:12:21.446567561Z",
"build_snapshot" : false,
"lucene_version" : "8.11.1",
"minimum_wire_compatibility_version" : "6.8.0",
"minimum_index_compatibility_version" : "6.0.0-beta1"
},
"tagline" : "You Know, for Search"
}
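
As an extra sanity check (not part of the original guide), you can also confirm the cluster health and that Elasticsearch is listening where you expect:

curl -s "localhost:9200/_cluster/health?pretty"
sudo ss -tlnp | grep 9200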

sudo apt install kibana

sudo systemctl enable kibana

sudo systemctl start kibana

echo "super-kibana:openssl passwd -apr1" | sudo tee -a /etc/nginx/htpasswd.users
asd334^$fa
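
The htpasswd.users file only takes effect if Nginx is actually proxying Kibana; the guide assumes an Nginx server block roughly like this (not shown above; your_domain_or_ip and the site file path are placeholders, 5601 is Kibana's default port):

server {
    listen 80;
    server_name your_domain_or_ip;

    # basic auth using the file created above
    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    location / {
        # forward everything to the local Kibana instance
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}

Enable the site, then run sudo nginx -t && sudo systemctl reload nginx.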

sudo apt install logstash

sudo vim /etc/logstash/conf.d/communigate-syslog.conf

input {
  tcp {
    port => 601
    type => "communigate"
  }
  syslog {
    port => 5514
    type => "syslog"
  }
}

filter {
  if [type] == "communigate" {
    # filter for communigate logs
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \[%{DATA:thread}\] %{JAVACLASS:class} - %{GREEDYDATA:logmessage}" }
    }
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  } else if [type] == "syslog" {
    # filter for syslog logs
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  if [type] == "communigate" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "communigate-%{+YYYY.MM.dd}"
    }
  } else if [type] == "syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
}
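
Before starting the service, you can let Logstash validate the pipeline file itself; a quick sanity check along these lines (the --path.settings flag is needed when running the packaged binary directly):

sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit -f /etc/logstash/conf.d/communigate-syslog.conf

It should end with "Configuration OK" if the syntax is valid.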

sudo systemctl start logstash
sudo systemctl enable logstash

Configure Kibana:
Open the Kibana configuration file /etc/kibana/kibana.yml in a text editor.
Uncomment the line server.host: and set its value to 0.0.0.0 to allow remote access.
Save and close the file.
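
With that change, the relevant lines in /etc/kibana/kibana.yml look roughly like this (defaults shown for the other two settings; adjust if yours differ):

server.port: 5601
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]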

sudo systemctl start kibana
sudo systemctl enable kibana

After completing these steps, you should be able to access Kibana at http:// and view the incoming logs from Communigate Pro. You may need to configure the firewall or security groups to allow traffic on the required ports (e.g., 9200, 5514, and 5601).
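
For example, with ufw on Ubuntu, opening those ports might look like this (drop the ones you do not expose; 9200 usually does not need to be reachable from outside when Logstash and Kibana run on the same host):

sudo ufw allow 5601/tcp    # Kibana (or 80/tcp if you go through Nginx)
sudo ufw allow 5514/tcp    # Logstash syslog input
sudo ufw allow 5514/udp
sudo ufw allow 601/tcp     # Logstash tcp input for CommuniGate (as configured above)
sudo ufw status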

Please format your code/logs/config using the </> button, or markdown style backticks. It helps to make things easy to read, which helps us help you.

If you go to Kibana -> Index Management, do you see any of your expected indices there? Does it show a Document count and a Size? If it shows more than zero documents, then data is coming into the Elasticsearch storage/database. If it is zero, then it is not surprising that you cannot see anything in Kibana Discover.

If it shows documents in the indices, then it might be because the document lacks the magic @timestamp field. Without that, Kibana will not show anything when you say Last 15 minutes, or even Last 7 days, because without a timestamp, last anything does not mean anything.
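
If you prefer the command line, roughly the same check can be done with curl against Elasticsearch (index names as used in your Logstash output):

curl -s "localhost:9200/_cat/indices/communigate-*,syslog-*?v"
curl -s "localhost:9200/syslog-*/_search?size=1&pretty"

The second call also shows whether the stored documents contain an @timestamp field.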

I went through the same setup again (Elasticsearch, Kibana with the kibana.yml change and the Nginx htpasswd entry, and Logstash with the communigate-syslog.conf pipeline shown above), but Kibana still shows no logs. Here is what journalctl shows for the Logstash service:

elk@elk:~$ sudo journalctl -u logstash -f
Mar 31 08:20:41 elk logstash[624450]: jdk.internal.reflect.GeneratedMethodAccessor37.invoke(jdk/internal/reflect/GeneratedMethodAccessor37)
Mar 31 08:20:41 elk logstash[624450]: jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)
Mar 31 08:20:41 elk logstash[624450]: java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)
Mar 31 08:20:41 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:441)
Mar 31 08:20:41 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:305)
Mar 31 08:20:41 elk logstash[624450]: usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_2_dot_7_minus_java.lib.logstash.inputs.tcp.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.2.7-java/lib/logstash/inputs/tcp.rb:160)
Mar 31 08:20:41 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410)
Mar 31 08:20:41 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401)
Mar 31 08:20:41 elk logstash[624450]: org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
Mar 31 08:20:41 elk logstash[624450]: java.lang.Thread.run(java/lang/Thread.java:829)
Mar 31 08:20:42 elk logstash[624450]: [2023-03-31T08:20:42,419][INFO ][logstash.inputs.tcp      ][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] Starting tcp input listener {:address=>"0.0.0.0:601", :ssl_enable=>false}
Mar 31 08:20:42 elk logstash[624450]: [2023-03-31T08:20:42,420][WARN ][io.netty.channel.AbstractChannel][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] Force-closing a channel whose registration task was not accepted by an event loop: [id: 0x7d845b4e]
Mar 31 08:20:42 elk logstash[624450]: java.util.concurrent.RejectedExecutionException: event executor terminated
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:926) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:353) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:346) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:828) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:818) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.channel.AbstractChannel$AbstractUnsafe.register(AbstractChannel.java:483) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:87) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:81) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.channel.MultithreadEventLoopGroup.register(MultithreadEventLoopGroup.java:86) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:323) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.doBind(AbstractBootstrap.java:272) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.bind(AbstractBootstrap.java:268) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.bind(AbstractBootstrap.java:253) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:42 elk logstash[624450]:         at org.logstash.tcp.InputLoop.run(InputLoop.java:86) [logstash-input-tcp-6.2.7.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at jdk.internal.reflect.GeneratedMethodAccessor37.invoke(Unknown Source) ~[?:?]
Mar 31 08:20:42 elk logstash[624450]:         at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
Mar 31 08:20:42 elk logstash[624450]:         at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:441) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:305) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:32) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_2_dot_7_minus_java.lib.logstash.inputs.tcp.RUBY$method$run$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.2.7-java/lib/logstash/inputs/tcp.rb:160) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$block$start_input$1(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.runtime.Block.call(Block.java:139) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.RubyProc.call(RubyProc.java:318) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:42 elk logstash[624450]:         at java.lang.Thread.run(Thread.java:829) [?:?]
Mar 31 08:20:42 elk logstash[624450]: [2023-03-31T08:20:42,421][ERROR][logstash.javapipeline    ][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] A plugin had an unrecoverable error. Will restart this plugin.
Mar 31 08:20:42 elk logstash[624450]:   Pipeline_id:main
Mar 31 08:20:42 elk logstash[624450]:   Plugin: <LogStash::Inputs::Tcp type=>"communigate", port=>601, id=>"2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2", enable_metric=>true, codec=><LogStash::Codecs::Line id=>"line_231b2918-dc95-4438-9671-8e984c4c48b7", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">, host=>"0.0.0.0", mode=>"server", proxy_protocol=>false, ssl_enable=>false, ssl_verify=>true, ssl_key_passphrase=><password>, tcp_keep_alive=>false, dns_reverse_lookup_enabled=>true>
Mar 31 08:20:42 elk logstash[624450]:   Error: event executor terminated
Mar 31 08:20:42 elk logstash[624450]:   Exception: Java::JavaUtilConcurrent::RejectedExecutionException
Mar 31 08:20:42 elk logstash[624450]:   Stack: io.netty.util.concurrent.SingleThreadEventExecutor.reject(io/netty/util/concurrent/SingleThreadEventExecutor.java:926)
Mar 31 08:20:42 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:353)
Mar 31 08:20:42 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.addTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:346)
Mar 31 08:20:42 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.execute(io/netty/util/concurrent/SingleThreadEventExecutor.java:828)
Mar 31 08:20:42 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.execute(io/netty/util/concurrent/SingleThreadEventExecutor.java:818)
Mar 31 08:20:42 elk logstash[624450]: io.netty.channel.AbstractChannel$AbstractUnsafe.register(io/netty/channel/AbstractChannel.java:483)
Mar 31 08:20:42 elk logstash[624450]: io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:87)
Mar 31 08:20:42 elk logstash[624450]: io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:81)
Mar 31 08:20:42 elk logstash[624450]: io.netty.channel.MultithreadEventLoopGroup.register(io/netty/channel/MultithreadEventLoopGroup.java:86)
Mar 31 08:20:42 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.initAndRegister(io/netty/bootstrap/AbstractBootstrap.java:323)
Mar 31 08:20:42 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.doBind(io/netty/bootstrap/AbstractBootstrap.java:272)
Mar 31 08:20:42 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:268)
Mar 31 08:20:42 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:253)
Mar 31 08:20:42 elk logstash[624450]: org.logstash.tcp.InputLoop.run(org/logstash/tcp/InputLoop.java:86)
Mar 31 08:20:42 elk logstash[624450]: jdk.internal.reflect.GeneratedMethodAccessor37.invoke(jdk/internal/reflect/GeneratedMethodAccessor37)
Mar 31 08:20:42 elk logstash[624450]: jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)
Mar 31 08:20:42 elk logstash[624450]: java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)
Mar 31 08:20:42 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:441)
Mar 31 08:20:42 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:305)
Mar 31 08:20:42 elk logstash[624450]: usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_2_dot_7_minus_java.lib.logstash.inputs.tcp.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.2.7-java/lib/logstash/inputs/tcp.rb:160)
Mar 31 08:20:42 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410)
Mar 31 08:20:42 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401)
Mar 31 08:20:42 elk logstash[624450]: org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
Mar 31 08:20:42 elk logstash[624450]: java.lang.Thread.run(java/lang/Thread.java:829)
Mar 31 08:20:43 elk logstash[624450]: [2023-03-31T08:20:43,422][INFO ][logstash.inputs.tcp      ][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] Starting tcp input listener {:address=>"0.0.0.0:601", :ssl_enable=>false}
Mar 31 08:20:43 elk logstash[624450]: [2023-03-31T08:20:43,422][WARN ][io.netty.channel.AbstractChannel][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] Force-closing a channel whose registration task was not accepted by an event loop: [id: 0x5ed5ebb0]
Mar 31 08:20:43 elk logstash[624450]: java.util.concurrent.RejectedExecutionException: event executor terminated
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:926) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:353) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:346) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:828) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:818) ~[netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.channel.AbstractChannel$AbstractUnsafe.register(AbstractChannel.java:483) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:87) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:81) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.channel.MultithreadEventLoopGroup.register(MultithreadEventLoopGroup.java:86) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:323) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.doBind(AbstractBootstrap.java:272) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.bind(AbstractBootstrap.java:268) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at io.netty.bootstrap.AbstractBootstrap.bind(AbstractBootstrap.java:253) [netty-all-4.1.65.Final.jar:4.1.65.Final]
Mar 31 08:20:43 elk logstash[624450]:         at org.logstash.tcp.InputLoop.run(InputLoop.java:86) [logstash-input-tcp-6.2.7.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at jdk.internal.reflect.GeneratedMethodAccessor37.invoke(Unknown Source) ~[?:?]
Mar 31 08:20:43 elk logstash[624450]:         at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
Mar 31 08:20:43 elk logstash[624450]:         at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:441) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:305) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:32) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_2_dot_7_minus_java.lib.logstash.inputs.tcp.RUBY$method$run$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.2.7-java/lib/logstash/inputs/tcp.rb:160) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$block$start_input$1(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.runtime.Block.call(Block.java:139) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.RubyProc.call(RubyProc.java:318) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105) [jruby-complete-9.2.20.1.jar:?]
Mar 31 08:20:43 elk logstash[624450]:         at java.lang.Thread.run(Thread.java:829) [?:?]
Mar 31 08:20:43 elk logstash[624450]: [2023-03-31T08:20:43,424][ERROR][logstash.javapipeline    ][main][2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2] A plugin had an unrecoverable error. Will restart this plugin.
Mar 31 08:20:43 elk logstash[624450]:   Pipeline_id:main
Mar 31 08:20:43 elk logstash[624450]:   Plugin: <LogStash::Inputs::Tcp type=>"communigate", port=>601, id=>"2310d51a6dd433592794f7ed807b38d7ca33af38fb7647a6fccc7fdf2230a5d2", enable_metric=>true, codec=><LogStash::Codecs::Line id=>"line_231b2918-dc95-4438-9671-8e984c4c48b7", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">, host=>"0.0.0.0", mode=>"server", proxy_protocol=>false, ssl_enable=>false, ssl_verify=>true, ssl_key_passphrase=><password>, tcp_keep_alive=>false, dns_reverse_lookup_enabled=>true>
Mar 31 08:20:43 elk logstash[624450]:   Error: event executor terminated
Mar 31 08:20:43 elk logstash[624450]:   Exception: Java::JavaUtilConcurrent::RejectedExecutionException
Mar 31 08:20:43 elk logstash[624450]:   Stack: io.netty.util.concurrent.SingleThreadEventExecutor.reject(io/netty/util/concurrent/SingleThreadEventExecutor.java:926)
Mar 31 08:20:43 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:353)
Mar 31 08:20:43 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.addTask(io/netty/util/concurrent/SingleThreadEventExecutor.java:346)
Mar 31 08:20:43 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.execute(io/netty/util/concurrent/SingleThreadEventExecutor.java:828)
Mar 31 08:20:43 elk logstash[624450]: io.netty.util.concurrent.SingleThreadEventExecutor.execute(io/netty/util/concurrent/SingleThreadEventExecutor.java:818)
Mar 31 08:20:43 elk logstash[624450]: io.netty.channel.AbstractChannel$AbstractUnsafe.register(io/netty/channel/AbstractChannel.java:483)
Mar 31 08:20:43 elk logstash[624450]: io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:87)
Mar 31 08:20:43 elk logstash[624450]: io.netty.channel.SingleThreadEventLoop.register(io/netty/channel/SingleThreadEventLoop.java:81)
Mar 31 08:20:43 elk logstash[624450]: io.netty.channel.MultithreadEventLoopGroup.register(io/netty/channel/MultithreadEventLoopGroup.java:86)
Mar 31 08:20:43 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.initAndRegister(io/netty/bootstrap/AbstractBootstrap.java:323)
Mar 31 08:20:43 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.doBind(io/netty/bootstrap/AbstractBootstrap.java:272)
Mar 31 08:20:43 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:268)
Mar 31 08:20:43 elk logstash[624450]: io.netty.bootstrap.AbstractBootstrap.bind(io/netty/bootstrap/AbstractBootstrap.java:253)
Mar 31 08:20:43 elk logstash[624450]: org.logstash.tcp.InputLoop.run(org/logstash/tcp/InputLoop.java:86)
Mar 31 08:20:43 elk logstash[624450]: jdk.internal.reflect.GeneratedMethodAccessor37.invoke(jdk/internal/reflect/GeneratedMethodAccessor37)
Mar 31 08:20:43 elk logstash[624450]: jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)
Mar 31 08:20:43 elk logstash[624450]: java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)
Mar 31 08:20:43 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:441)
Mar 31 08:20:43 elk logstash[624450]: org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:305)
Mar 31 08:20:43 elk logstash[624450]: usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_tcp_minus_6_dot_2_dot_7_minus_java.lib.logstash.inputs.tcp.run(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-tcp-6.2.7-java/lib/logstash/inputs/tcp.rb:160)
Mar 31 08:20:43 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.inputworker(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410)
Mar 31 08:20:43 elk logstash[624450]: usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_input(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401)
Mar 31 08:20:43 elk logstash[624450]: org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)
Mar 31 08:20:43 elk logstash[624450]: java.lang.Thread.run(java/lang/Thread.java:829)

How should I rectify this issue?

elk@elk:/var/log/logstash$ tail -f logstash-plain.log
[2023-03-31T08:33:09,536][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:09,536][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:10,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:14,159][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:14,545][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:14,546][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:15,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:19,161][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:19,558][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:19,559][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:20,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:24,164][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:24,568][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:24,569][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:25,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:29,167][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:29,579][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:29,579][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:30,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:34,170][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:34,588][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:34,588][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T08:33:35,662][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T08:33:39,172][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T08:33:39,599][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T08:33:39,599][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

It seems that your Logstash is not starting correctly, probably because of your TCP input configuration.

tcp {
    port => 601
    type => "communigate"
}

This is a privileged port: ports below 1024 can only be bound by the root user, and Logstash runs as the logstash user, so it cannot bind to port 601.
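
You can confirm this on the ELK host itself, for example (the User= value assumes the standard unit file from the deb package):

systemctl show -p User logstash        # should print User=logstash
sudo ss -tlnp | grep ':601'            # nothing will be listening on 601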

Try changing the port to something like 1601, point your source at that port, and see if the error stops.
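
On the sending host, if it runs rsyslog, forwarding to the Logstash syslog input might look roughly like this (the file name and ELK hostname are placeholders; @@ means TCP, a single @ would be UDP):

# /etc/rsyslog.d/90-forward-to-elk.conf   (example path on the sending host)
*.* @@elk.example.com:5514

Then restart rsyslog on that host with sudo systemctl restart rsyslog.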

root@elk:/etc# cat /etc/logstash/conf.d/communigate-syslog.conf
input {
  tcp {
    port => 1601
    type => "communigate"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
  syslog {
    port => 5514
    type => "syslog"
    codec => multiline {
      pattern => "^%{SYSLOGTIMESTAMP}"
      negate => true
      what => "previous"
    }
  }
}

filter {
  if [type] == "communigate" {
    # filter for communigate logs
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \[%{DATA:thread}\] %{JAVACLASS:class} - %{GREEDYDATA:logmessage}" }
    }
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  } else if [type] == "syslog" {
    # filter for syslog logs
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  if [type] == "communigate" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "communigate-%{+YYYY.MM.dd}"
    }
  } else if [type] == "syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
}
root@elk:/etc#
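
Once Logstash is running with this file, you can push a couple of hand-made lines into the tcp input and see whether a communigate-* index appears (the sample lines and class name are made up to fit the grok pattern above; two lines are sent because the multiline codec only flushes an event when the next matching line arrives):

printf '2023-03-31T10:30:00.123 INFO [worker-1] com.example.Test - test one\n2023-03-31T10:30:01.456 INFO [worker-1] com.example.Test - test two\n' | timeout 2 nc localhost 1601
curl -s "localhost:9200/_cat/indices/communigate-*?v"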

Here is part of the output of tail -f /var/log/logstash/logstash-plain.log:

th.data" setting.
[2023-03-31T10:06:55,246][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.20.1.jar:?]
        at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.20.1.jar:?]
        at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:94) ~[?:?]
[2023-03-31T10:06:57,162][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:06:57,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:00,238][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:00,238][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:02,164][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:02,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:05,242][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:05,242][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:07,165][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:07,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:10,246][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:10,246][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:12,166][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:12,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:15,249][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:15,249][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:17,167][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:17,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:20,253][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:20,253][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:22,168][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:22,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:25,257][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:25,258][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:27,169][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:27,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:30,262][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:30,263][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:32,170][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:32,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:35,268][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:35,268][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:37,171][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:37,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:40,271][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:40,271][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:42,172][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:42,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:45,274][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:45,274][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:47,173][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:47,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:50,278][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:50,278][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:52,174][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:52,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:07:55,282][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2023-03-31T10:07:55,282][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2023-03-31T10:07:57,184][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2023-03-31T10:07:57,695][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2023-03-31T10:08:34,301][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2023-03-31T10:08:34,311][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.17.9", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.18+10 on 11.0.18+10 +indy +jit [linux-x86_64]"}
[2023-03-31T10:08:34,317][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djdk.io.File.enableADS=true, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
[2023-03-31T10:08:35,700][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-03-31T10:08:37,626][INFO ][org.reflections.Reflections] Reflections took 88 ms to scan 1 urls, producing 119 keys and 419 values
[2023-03-31T10:08:39,395][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2023-03-31T10:08:39,694][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2023-03-31T10:08:39,890][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2023-03-31T10:08:39,901][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.9) {:es_version=>7}
[2023-03-31T10:08:39,903][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2023-03-31T10:08:39,999][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2023-03-31T10:08:40,000][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2023-03-31T10:08:40,003][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2023-03-31T10:08:40,023][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2023-03-31T10:08:40,038][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2023-03-31T10:08:40,043][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.9) {:es_version=>7}
[2023-03-31T10:08:40,044][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2023-03-31T10:08:40,052][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2023-03-31T10:08:40,083][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2023-03-31T10:08:40,083][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2023-03-31T10:08:40,093][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2023-03-31T10:08:40,278][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/etc/logstash/conf.d/communigate-syslog.conf"], :thread=>"#<Thread:0x3c7d98eb run>"}
[2023-03-31T10:08:41,393][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.11}
[2023-03-31T10:08:41,619][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-03-31T10:08:41,624][INFO ][logstash.inputs.tcp      ][main][8c1acdae7fa7e3e5188de6b8fd6590e6bf5015be55235c807722c5d9a9ed2849] Starting tcp input listener {:address=>"0.0.0.0:1601", :ssl_enable=>false}
[2023-03-31T10:08:41,641][INFO ][logstash.inputs.syslog   ][main][337362fefa288f7e26c1c0ba3e17a7bfbf96b8754e07421ebc926164397e8595] Starting syslog tcp listener {:address=>"0.0.0.0:5514"}
[2023-03-31T10:08:41,654][INFO ][logstash.inputs.syslog   ][main][337362fefa288f7e26c1c0ba3e17a7bfbf96b8754e07421ebc926164397e8595] Starting syslog udp listener {:address=>"0.0.0.0:5514"}
[2023-03-31T10:08:41,699][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-03-31T10:09:01,092][INFO ][logstash.inputs.syslog   ][main][337362fefa288f7e26c1c0ba3e17a7bfbf96b8754e07421ebc926164397e8595] new connection {:client=>"172.20.111.149:46388"}
[2023-03-31T10:24:31,159][INFO ][logstash.inputs.syslog   ][main][337362fefa288f7e26c1c0ba3e17a7bfbf96b8754e07421ebc926164397e8595] new connection {:client=>"172.20.111.149:35680"}

It looks OK now.

[2023-03-31T10:08:41,624][INFO ][logstash.inputs.tcp      ][main][8c1acdae7fa7e3e5188de6b8fd6590e6bf5015be55235c807722c5d9a9ed2849] Starting tcp input listener {:address=>"0.0.0.0:1601", :ssl_enable=>false}
[2023-03-31T10:08:41,641][INFO ][logstash.inputs.syslog   ][main][337362fefa288f7e26c1c0ba3e17a7bfbf96b8754e07421ebc926164397e8595] Starting syslog tcp listener {:address=>"0.0.0.0:5514"}
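
If Discover still shows nothing, a quick document count per index pattern tells you whether anything is actually being indexed:

curl -s "localhost:9200/syslog-*/_count?pretty"
curl -s "localhost:9200/communigate-*/_count?pretty"

If the counts stay at zero, the problem is upstream of Elasticsearch; if they grow but Discover is empty, check the index pattern and the time range/@timestamp as mentioned above.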
