Hey Stephen,
I commented out all the other pipelines, leaving only eslogs enabled, and I still don't get any events.
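For reference, the eslogs pipeline is essentially just a file input on the Elasticsearch log directory. This is a simplified sketch, not the exact config: the path is the one the file input registers in the trace below, and the output here is only a placeholder, not my real one:

    input {
      file {
        # same glob the file input registers in the TRACE line below
        path => ["/var/log/elasticsearch/*.log"]
      }
    }

    output {
      # placeholder output just to confirm events arrive
      stdout { codec => rubydebug }
    }

And here's the debug/trace output: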
[2021-02-19T16:26:22,132][INFO ][logstash.javapipeline ][eslogs] Pipeline Java execution initialization time {"seconds"=>1.2}
[2021-02-19T16:26:22,314][TRACE][logstash.inputs.file ][eslogs] Registering file input {:path=>["/var/log/elasticsearch/*.log"]}
[2021-02-19T16:26:22,361][INFO ][logstash.inputs.file ][eslogs] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_5cd297ac93ac4edccc765c8fc26ed0c5", :path=>["/var/log/elasticsearch/*.log"]}
[2021-02-19T16:26:22,381][INFO ][logstash.javapipeline ][eslogs] Pipeline started {"pipeline.id"=>"eslogs"}
[2021-02-19T16:26:22,390][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"eslogs", :thread=>"#<Thread:0x1fb42130 run>"}
[2021-02-19T16:26:22,392][DEBUG][org.logstash.execution.PeriodicFlush][eslogs] Pushing flush onto pipeline.
[2021-02-19T16:26:22,407][TRACE][logstash.agent ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: eslogs, action_type: LogStash::PipelineAction::Create"]}
[2021-02-19T16:26:22,432][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:eslogs], :non_running_pipelines=>[]}
[2021-02-19T16:26:22,437][INFO ][filewatch.observingtail ][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] START, creating Discoverer, Watch with file and sincedb collections
[2021-02-19T16:26:22,463][DEBUG][logstash.agent ] Starting puma
[2021-02-19T16:26:22,477][DEBUG][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: reading from /var/lib/logstash/plugins/inputs/file/.sincedb_5cd297ac93ac4edccc765c8fc26ed0c5
[2021-02-19T16:26:22,482][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2021-02-19T16:26:22,498][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: importing #<struct FileWatch::InodeStruct inode="201326700", maj=0, min=66305> => #<FileWatch::SincedbValue:0x6a312d4f @last_changed_at=1613764593.460017, @path_in_sincedb="/var/log/elasticsearch/elastic-cluster_index_indexing_slowlog.log", @watched_file=nil, @position=0>
[2021-02-19T16:26:22,506][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: importing #<struct FileWatch::InodeStruct inode="201326701", maj=0, min=66305> => #<FileWatch::SincedbValue:0x36ec659a @last_changed_at=1613764593.472921, @path_in_sincedb="/var/log/elasticsearch/elastic-cluster_index_search_slowlog.log", @watched_file=nil, @position=0>
[2021-02-19T16:26:22,507][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: importing #<struct FileWatch::InodeStruct inode="201328376", maj=0, min=66305> => #<FileWatch::SincedbValue:0x4dd9b9b8 @last_changed_at=1613764600.921809, @path_in_sincedb="/var/log/elasticsearch/elastic-cluster.log", @watched_file=nil, @position=8421121>
[2021-02-19T16:26:22,509][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: importing #<struct FileWatch::InodeStruct inode="201328371", maj=0, min=66305> => #<FileWatch::SincedbValue:0x2cef023a @last_changed_at=1613764593.480989, @path_in_sincedb="/var/log/elasticsearch/gc.log", @watched_file=nil, @position=0>
[2021-02-19T16:26:22,510][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: importing #<struct FileWatch::InodeStruct inode="201328333", maj=0, min=66305> => #<FileWatch::SincedbValue:0x300b223e @last_changed_at=1613764593.479392, @path_in_sincedb="/var/log/elasticsearch/elastic-cluster_deprecation.log", @watched_file=nil, @position=0>
[2021-02-19T16:26:22,512][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] open: count of keys read: 5
[2021-02-19T16:26:22,518][DEBUG][logstash.api.service ] [api-service] start
[2021-02-19T16:26:22,539][TRACE][filewatch.discoverer ][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] discover_files {:count=>0}
[2021-02-19T16:26:22,641][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-02-19T16:26:23,564][DEBUG][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] writing sincedb (delta since last write = 1613769983)
[2021-02-19T16:26:23,569][TRACE][filewatch.sincedbcollection][eslogs][0a6bbd6a4f1c0e1f61a82480deb9812aca25fc4036a5a3706e31d72f0f190422] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_5cd297ac93ac4edccc765c8fc26ed0c5 (time = 2021-02-19 16:26:23 -0500)
[2021-02-19T16:26:25,107][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-02-19T16:26:25,108][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-02-19T16:26:27,391][DEBUG][org.logstash.execution.PeriodicFlush][eslogs] Pushing flush onto pipeline.
[2021-02-19T16:26:30,119][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-02-19T16:26:30,127][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-02-19T16:26:32,391][DEBUG][org.logstash.execution.PeriodicFlush][eslogs] Pushing flush onto pipeline.
[2021-02-19T16:26:35,135][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
So it doesn't pick up any events when run via the service, but it works fine as a standalone process.
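In case it matters how I'm invoking it each way (the standalone command just mirrors the ExecStart from the unit file below, run from a shell):

    # standalone, run manually from a shell -- this does pick up events
    /usr/share/logstash/bin/logstash --path.settings /etc/logstash

    # as a service -- same command via the systemd unit below, but no events
    systemctl restart logstash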
logstash.service:
[Unit]
Description=logstash
[Service]
Type=simple
User=logstash
Group=logstash
# Load env vars from /etc/default/ and /etc/sysconfig/ if they exist.
# Prefixing the path with '-' makes it try to load, but if the file doesn't
# exist, it continues onward.
EnvironmentFile=-/etc/default/logstash
EnvironmentFile=-/etc/sysconfig/logstash
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash"
Restart=always
WorkingDirectory=/
Nice=19
LimitNOFILE=16384
# When stopping, how long to wait before giving up and sending SIGKILL?
# Keep in mind that SIGKILL on a process can cause data loss.
TimeoutStopSec=infinity
[Install]
WantedBy=multi-user.target