Just poking around to see what the typical setup looks like.
I see a lot of people have a Logstash Server that they forward messages to. And that server in turn forwards data to Elasticsearch.
Is the Logstash server resource intensive? I was thinking that Logstash would run on my application server, but it seems most people now use Filebeat to do the log shipping. I do not have many servers. Should I run Logstash on my Elasticsearch server (I only have one for now), standalone, or on the app server?
Feeling a little confused and overwhelmed by the choices. Thanks.
It can be CPU intensive; it really depends on your grok filters.
Start with it on the ES server, then move it off if you see high load.
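To give a sense of where that CPU goes, here is a hypothetical Logstash filter (the pattern and field names are made up for illustration): grok applies regex-based matching to every event, which is the expensive part.

```conf
# Hypothetical Logstash pipeline fragment. Each incoming event is run
# through the regex-based grok pattern below; this matching is what
# drives Logstash CPU usage up on busy log streams.
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
```

If the logs arrive already structured (e.g. JSON), you can skip grok entirely and the load drops accordingly.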
The plan as of now is to use Syslog to move the logs to the ELK server. Again, is this pretty normal?
I am writing the logs with the logback-logstash adapter, so they will already be formatted and there isn't much grokking to be done. But how will the syslog reader on the ELK machine know which log message is coming from which server/application? If I have 3 apps running on one server and they all log to syslog, how will I know which is which?
Sorry, it must be pretty clear that I have not actually used syslog before. I still have a lot of homework to do.
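For reference, here is a minimal sketch of what that adapter setup might look like, assuming the logstash-logback-encoder library; the destination host, port, and app name are placeholders:

```xml
<!-- Hypothetical logback.xml fragment using logstash-logback-encoder.
     customFields stamps every JSON event with the originating app name,
     which is one way to tell apps apart when they share a server. -->
<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <destination>elk.example.com:5000</destination>
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <customFields>{"app":"my-app-1"}</customFields>
  </encoder>
</appender>

<root level="INFO">
  <appender-ref ref="LOGSTASH"/>
</root>
```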
Probably best to use a port per application; then you can split them apart more easily.
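A rough sketch of that idea as a Logstash pipeline, assuming the apps ship JSON over TCP (the ports and app names here are placeholders):

```conf
# Hypothetical Logstash pipeline: one TCP input per application, so the
# listening port identifies the source even when all three apps run on
# the same server.
input {
  tcp { port => 5001 type => "app-one"   codec => json_lines }
  tcp { port => 5002 type => "app-two"   codec => json_lines }
  tcp { port => 5003 type => "app-three" codec => json_lines }
}

output {
  # The "type" field set on each input travels with every event into
  # Elasticsearch, so you can filter per application later.
  elasticsearch { hosts => ["localhost:9200"] }
}
```

Each app then points its appender at its own port, and no grok is needed to work out who sent what.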