All servers are Linux (Ubuntu 20.04):
2x ES nodes
1x Kibana
1x Logstash
All of them have had CIS hardening applied.
The most recent entries from logstash-plain.log:
[2021-11-02T12:19:50,707][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-02T12:19:54,995][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
[2021-11-02T12:19:54,999][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/etc/logstash/conf.d/metricbeat/2-wds-metricbeat-filter.conf.old", "/etc/logstash/conf.d/metricbeat/3-wds-metricbeat-output.conf.old", "/etc/logstash/conf.d/metricbeat/98-fail-filter.conf.old", "/etc/logstash/conf.d/metricbeat/99-fail-output.conf.old", "/etc/logstash/conf.d/metricbeat/test.conf.old"]}
[2021-11-02T12:19:55,000][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/metricbeat/1-wds-metricbeat.conf"}
[2021-11-02T12:19:55,000][DEBUG][org.logstash.config.ir.PipelineConfig] -------- Logstash Config ---------
[2021-11-02T12:19:55,001][DEBUG][org.logstash.config.ir.PipelineConfig] Config from source, source: LogStash::Config::Source::MultiLocal, pipeline_id:: metricbeat
[2021-11-02T12:19:55,001][DEBUG][org.logstash.config.ir.PipelineConfig] Config string, protocol: file, id: /etc/logstash/conf.d/metricbeat/1-wds-metricbeat.conf
[2021-11-02T12:19:55,001][DEBUG][org.logstash.config.ir.PipelineConfig]
input {
beats {
port => 2598
type => "wds-metricbeat-input"
}
}
filter {
}
output {
if [type] == "wds-metricbeat-input" {
elasticsearch {
hosts => "http://10.0.60.60:9200"
user => logstash_system
password => 6EnArfBZ6OZtL2ncpkHQ
index => "ecs-metricbeat-%{+YYYY.MM.dd}"
}
}
else {
elasticsearch {
hosts => "http://10.0.60.60:9200"
user => logstash_system
password => 6EnArfBZ6OZtL2ncpkHQ
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
}
}
}
[2021-11-02T12:19:55,001][DEBUG][org.logstash.config.ir.PipelineConfig] Merged config
[2021-11-02T12:19:55,001][DEBUG][org.logstash.config.ir.PipelineConfig]
input {
beats {
port => 2598
type => "wds-metricbeat-input"
}
}
filter {
}
output {
if [type] == "wds-metricbeat-input" {
elasticsearch {
hosts => "http://10.0.60.60:9200"
user => logstash_system
password => 6EnArfBZ6OZtL2ncpkHQ
index => "ecs-metricbeat-%{+YYYY.MM.dd}"
}
}
else {
elasticsearch {
hosts => "http://10.0.60.60:9200"
user => logstash_system
password => 6EnArfBZ6OZtL2ncpkHQ
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
}
}
}
[2021-11-02T12:19:55,001][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2021-11-02T12:19:55,003][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:metricbeat}
[2021-11-02T12:19:55,014][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:metricbeat, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"{\" at line 17, column 11 (byte 308) after output {\n if [type] == \"wds-metricbeat-input\" {\n elasticsearch {\n hosts => \"http://10.0.60.60:9200\"\n user => logstash_system\n password => 6EnArfBZ6OZtL2ncpkHQ\n ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:391:in `block in converge_state'"]}
[2021-11-02T12:19:55,711][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-02T12:19:55,711][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-02T12:20:00,716][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-02T12:20:00,718][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-02T12:20:05,721][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-02T12:20:05,722][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-11-02T12:20:10,725][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-11-02T12:20:10,725][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
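One thing I noticed while writing this up: the ConfigurationError points at line 17, column 11, which is right after the password line in the quoted config snippet. As far as I know, Logstash plugin option values have to be quoted strings (or numbers/booleans), and an unquoted value like the password here, which starts with a digit, can trip the parser. A sketch of the output block with both credentials quoted, same hosts and indices, in case that is the culprit (untested):

```
output {
  if [type] == "wds-metricbeat-input" {
    elasticsearch {
      hosts    => "http://10.0.60.60:9200"
      user     => "logstash_system"
      password => "6EnArfBZ6OZtL2ncpkHQ"
      index    => "ecs-metricbeat-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts    => "http://10.0.60.60:9200"
      user     => "logstash_system"
      password => "6EnArfBZ6OZtL2ncpkHQ"
      index    => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}
```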
And here is pipeline_metricbeat.log, also from last night:
[2021-11-01T16:44:23,305][DEBUG][org.logstash.beats.ConnectionHandler] 170d2444: batches pending: true
[2021-11-01T16:44:23,309][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Received a new payload
[2021-11-01T16:44:23,309][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 17
[2021-11-01T16:44:23,325][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 18
[2021-11-01T16:44:23,327][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 19
[2021-11-01T16:44:23,328][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 20
[2021-11-01T16:44:23,329][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 21
[2021-11-01T16:44:23,330][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 22
[2021-11-01T16:44:23,331][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 23
[2021-11-01T16:44:23,332][DEBUG][org.logstash.beats.BeatsHandler] [local: 127.0.0.1:2598, remote: 127.0.0.1:49358] Sending a new message for the listener, sequence: 24
[2021-11-01T16:44:23,333][DEBUG][org.logstash.beats.BeatsHandler] 170d2444: batches pending: false
[2021-11-01T16:44:23,443][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,444][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:23,446][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/tmp/test.txt"}
[2021-11-01T16:44:24,041][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:24,041][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=true, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:32,047][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:32,047][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=true, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:32,686][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-11-01T16:44:34,048][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:34,049][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=true, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:36,050][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:36,050][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=true, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:37,686][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-11-01T16:44:37,712][DEBUG][logstash.outputs.file ] Starting stale files cleanup cycle {:files=>{"/tmp/test.txt"=>#<IOWriter:0x62c4c829 @active=true, @io=#<File:/tmp/test.txt>>}}
[2021-11-01T16:44:40,053][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:40,053][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=false, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:42,054][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:44:42,054][DEBUG][logstash.outputs.file ] Flushing file {:path=>"/tmp/test.txt", :fd=>#<IOWriter:0x62c4c829 @active=false, @io=#<File:/tmp/test.txt>>}
[2021-11-01T16:44:47,714][INFO ][logstash.outputs.file ] Closing file /tmp/test.txt
[2021-11-01T16:44:57,686][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-11-01T16:44:58,061][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:45:00,061][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:45:01,449][DEBUG][io.netty.buffer.PoolThreadCache] Freed 3 thread-local buffer(s) from thread: nioEventLoopGroup-2-3
[2021-11-01T16:45:01,449][DEBUG][io.netty.buffer.PoolThreadCache] Freed 3 thread-local buffer(s) from thread: nioEventLoopGroup-2-2
[2021-11-01T16:45:02,062][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:45:02,686][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-11-01T16:45:02,733][DEBUG][logstash.outputs.file ] Starting stale files cleanup cycle {:files=>{}}
[2021-11-01T16:45:02,734][DEBUG][logstash.outputs.file ] 0 stale files found {:inactive_files=>{}}
[2021-11-01T16:45:04,063][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:45:05,474][DEBUG][io.netty.buffer.PoolThreadCache] Freed 20 thread-local buffer(s) from thread: defaultEventExecutorGroup-4-1
[2021-11-01T16:45:05,474][DEBUG][io.netty.buffer.PoolThreadCache] Freed 21 thread-local buffer(s) from thread: defaultEventExecutorGroup-4-2
[2021-11-01T16:45:06,066][DEBUG][logstash.outputs.file ] Starting flush cycle
[2021-11-01T16:45:07,494][DEBUG][logstash.inputs.beats ] Closing {:plugin=>"LogStash::Inputs::Beats"}
[2021-11-01T16:45:07,500][DEBUG][logstash.pluginmetadata ] Removing metadata for plugin 3b99335f787b18a5454d2c07e964f07005d724e6a381adbba4fa7a9b127b2430
[2021-11-01T16:45:07,505][DEBUG][logstash.javapipeline ] Input plugins stopped! Will shutdown filter/output workers. {:pipeline_id=>"metricbeat", :thread=>"#<Thread:0x4ee1c9f3 run>"}
[2021-11-01T16:45:07,521][DEBUG][logstash.javapipeline ] Shutdown waiting for worker thread {:pipeline_id=>"metricbeat", :thread=>"#<Thread:0x5e4d2951 run>"}
[2021-11-01T16:45:07,611][DEBUG][logstash.outputs.file ] Closing {:plugin=>"LogStash::Outputs::File"}
[2021-11-01T16:45:08,071][DEBUG][logstash.outputs.file ] Close: closing files
[2021-11-01T16:45:08,072][DEBUG][logstash.pluginmetadata ] Removing metadata for plugin af32de41e63db12756a1d299870422f571f3368e645cc576585337a6345eb2a1
[2021-11-01T16:45:08,074][DEBUG][logstash.javapipeline ] Pipeline has been shutdown {:pipeline_id=>"metricbeat", :thread=>"#<Thread:0x4ee1c9f3 run>"}
[2021-11-01T16:45:08,076][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>"metricbeat"}
I've also had a chance to test the same config on a fresh box with no hardening and only Logstash installed, and it spits out the same error.
I then went through the logs on the ES box: nothing. No errors, nothing about connectivity issues, failed connections, or failed auths.
Is a curl test conclusive enough to say Logstash should have no issues communicating with the ES server? And is tcp/9200 the only port Logstash uses to communicate and upload its traffic?
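For reference, the curl test I mean is roughly the following, run from the Logstash box, using the host and credentials from the pipeline config above (adjust to your environment). A 200 response with the cluster banner would confirm both reachability and auth on tcp/9200, which, as I understand it, is the only port the elasticsearch output talks to by default:

```shell
# Connectivity/auth check against the ES HTTP endpoint from the config above.
# `|| echo 'unreachable'` keeps the script from aborting if the host is down,
# so $resp is either the cluster banner JSON or the word "unreachable".
resp=$(curl -s --max-time 5 -u 'logstash_system:6EnArfBZ6OZtL2ncpkHQ' \
  'http://10.0.60.60:9200/' || echo 'unreachable')
echo "$resp"
```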