Read logs using the ELK stack from the remote server

I have successfully installed the Elastic stack (Elasticsearch + Logstash + Kibana) and can use it with Logback logs on my local machine. But now I need to read logs from a remote server. I don't think I need to install the entire ELK stack on the remote server, not least because ELK consumes a fair amount of resources. For now I have only one remote server, and my plan looks like this: I set up the ELK stack on my local machine, and when I need to check the logs, I connect Logstash to the remote server and read them. Simple. But I have two questions:

  1. Is this a reasonable way to do this? I also thought about adding Filebeat to this pipeline, but I don't think I need it, because I will not have an always-on computer constantly receiving logs. I am looking for a simple solution for reading logs five to ten times a day.

  2. How can I implement this?

Logstash does not support reading logs from remote servers, so I would recommend installing Filebeat on the remote hosts.
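A minimal sketch of what the Filebeat configuration on the remote host could look like (the log path, port, and hostname below are placeholders, not values from your setup):

```yaml
# filebeat.yml on the remote server
filebeat.inputs:
  - type: log                       # "filestream" in newer Filebeat versions
    paths:
      - /var/log/myapp/*.log        # adjust to wherever your Logback logs live

output.logstash:
  hosts: ["my-local-machine:5044"]  # the machine running Logstash
```

Note that Filebeat keeps a registry of read offsets on disk, so after a reconnect it resumes where it left off instead of resending everything.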

Ok, thanks. And what do I need to do next, taking into account my requirements (I will not have an always-on computer constantly receiving logs)? Is it better to connect Filebeat to Elasticsearch or to Logstash on the local machine?

I tried to use Filebeat, but it kept trying to reconnect to Logstash ("Attempting to reconnect to backoff"), so I don't think it was designed for this kind of use. Or do I need to change some configuration to read logs on demand (let's call it that)?

Filebeat can send logs directly to Elasticsearch; if Filebeat's processors are enough for your parsing needs, you can skip Logstash.
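A sketch of the Filebeat output section pointed straight at Elasticsearch, skipping Logstash (the hostname and the commented-out credentials are placeholders):

```yaml
# filebeat.yml — send events directly to Elasticsearch
output.elasticsearch:
  hosts: ["my-local-machine:9200"]
  # username: "elastic"      # only if security is enabled on the cluster
  # password: "changeme"
```

With this setup Filebeat still retries with backoff while Elasticsearch is unreachable and resumes shipping once it comes back up.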

Depending on your requirements and scale, Logstash can help with the load, e.g. having two Logstash instances forward messages to the Elasticsearch cluster.

Thanks for your response. Can you explain the second part, the scheme with two Logstash instances, in more detail?

You can configure the output of your Filebeat to send to more than one Logstash, according to the documentation:

https://www.elastic.co/guide/en/beats/filebeat/current/logstash-output.html#loadbalance

loadbalance

If set to true and multiple Logstash hosts are configured, the output plugin load balances published events onto all Logstash hosts. If set to false, the output plugin sends all events to only one host (determined at random) and will switch to another host if the selected one becomes unresponsive. The default value is false.

```yaml
output.logstash:
  hosts: ["localhost:5044", "localhost:5045"]
  loadbalance: true
  index: filebeat
```

But I don't need this. I want to produce logs on the remote server, and when I need to check the logs I want to connect to this server and read them using ELK on my local machine. Please check my first post.
So, how can I do this?

Logstash and Filebeat do not support reading logs from remote machines, so you need to install Filebeat on the machine where the logs reside.

I already installed Filebeat on the machine where the logs reside, but it keeps trying to reconnect to Logstash ("Attempting to reconnect to backoff") whenever Logstash is offline. I already mentioned this, please check my earlier post.

So, how can I configure ELK for this purpose? Or I just can't do this?

For Filebeat to be able to connect to Logstash, both need to be running at the same time. If this is not possible, another route is to have Filebeat send data directly to Elasticsearch and use an ingest node pipeline to process it there.
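For example, an ingest pipeline can take over simple Logstash-style parsing; the pipeline name and the grok pattern below are only illustrative, adjust them to your actual Logback layout:

```
PUT _ingest/pipeline/logback-logs
{
  "description": "Parse Logback lines at ingest time",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}"]
      }
    }
  ]
}
```

Then point Filebeat's Elasticsearch output at that pipeline:

```yaml
output.elasticsearch:
  hosts: ["my-local-machine:9200"]
  pipeline: "logback-logs"
```

While your local Elasticsearch is offline, Filebeat on the remote server simply backs off and retries; once you start Elasticsearch, it catches up from where it stopped, which fits the "read logs a few times a day" requirement.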