I am trying to figure out whether these timeouts indicate a real problem or are just notifications of no consequence.
From a command prompt I can telnet to 10.10.98.102:5044 and the port opens, and with curl I can see a certificate being presented on that port.
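For reference, these are roughly the checks I ran (flags approximate; -v prints the TLS handshake including the server certificate, -k just skips verification for the test):

    telnet 10.10.98.102 5044
    curl -vk https://10.10.98.102:5044

Both succeed, so basic TCP connectivity to the Logstash port looks fine from this host.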
Thoughts?
Thank you
2021-03-05T03:37:20.718-0400 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":453,"time":{"ms":32}},"total":{"ticks":1515,"time":{"ms":94},"value":1515},"user":{"ticks":1062,"time":{"ms":62}}},"handles":{"open":461},"info":{"ephemeral_id":"aa16daa1-cdbd-4a7f-af88-32658cb8a3f1","uptime":{"ms":240097}},"memstats":{"gc_next":54539248,"memory_alloc":32704584,"memory_total":178850360,"rss":1101824},"runtime":{"goroutines":1685}},"filebeat":{"harvester":{"open_files":275,"running":272}},"libbeat":{"config":{"module":{"running":6}},"output":{"events":{"active":4096,"batches":2,"total":4096},"read":{"bytes":1381},"write":{"bytes":402195}},"pipeline":{"clients":6,"events":{"active":4121,"retry":4096}}},"registrar":{"states":{"current":531}}}}}
2021-03-05T03:37:36.214-0400 ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 10.10.92.13:51881->10.10.98.102:5044: i/o timeout
2021-03-05T03:37:36.214-0400 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-03-05T03:37:36.214-0400 INFO [publisher] pipeline/retry.go:223 done
2021-03-05T03:37:36.214-0400 ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: read tcp 10.10.92.13:51881->10.10.98.102:5044: i/o timeout
2021-03-05T03:37:36.214-0400 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-03-05T03:37:36.214-0400 INFO [publisher] pipeline/retry.go:223 done
2021-03-05T03:37:36.246-0400 ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: client is not connected
2021-03-05T03:37:36.246-0400 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-03-05T03:37:36.246-0400 INFO [publisher] pipeline/retry.go:223 done
2021-03-05T03:37:37.594-0400 ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: client is not connected
2021-03-05T03:37:37.594-0400 INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://10.10.98.102:5044))
2021-03-05T03:37:37.594-0400 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-03-05T03:37:37.594-0400 INFO [publisher] pipeline/retry.go:223 done
2021-03-05T03:37:50.718-0400 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":453},"total":{"ticks":1546,"time":{"ms":31},"value":1546},"user":{"ticks":1093,"time":{"ms":31}}},"handles":{"open":461},"info":{"ephemeral_id":"aa16daa1-cdbd-4a7f-af88-32658cb8a3f1","uptime":{"ms":270098}},"memstats":{"gc_next":54539248,"memory_alloc":35243472,"memory_total":181389248,"rss":512000},"runtime":{"goroutines":1684}},"filebeat":{"harvester":{"open_files":275,"running":272}},"libbeat":{"config":{"module":{"running":6},"scans":2},"output":{"events":{"active":-4096,"batches":1,"failed":6144,"total":2048},"read":{"errors":1},"write":{"bytes":272}},"pipeline":{"clients":6,"events":{"active":4121,"retry":4096}}},"registrar":{"states":{"current":531}}}}}
2021-03-05T03:38:10.587-0400 ERROR [publisher_pipeline_output] pipeline/output.go:154 Failed to connect to backoff(async(tcp://10.10.98.102:5044)): read tcp 10.10.92.13:51883->10.10.98.102:5044: i/o timeout
2021-03-05T03:38:10.587-0400 INFO [publisher_pipeline_output] pipeline/output.go:145 Attempting to reconnect to backoff(async(tcp://10.10.98.102:5044)) with 1 reconnect attempt(s)
2021-03-05T03:38:10.587-0400 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-03-05T03:38:10.587-0400 INFO [publisher] pipeline/retry.go:223 done