Advice needed on making logs available to application developers

Background: We have a number of large web applications deployed to WebLogic servers running on Red Hat Linux 8. Development teams need access to various logs (e.g. WebLogic and application logs) on these Linux machines. Since the dev teams do not have access to the Linux machines, we make the logs available via a small homegrown Java application that sends the logs from each Linux machine to a separate Windows folder on a network drive. I am trying to replace this Java app with the Elastic Stack. I have an Elasticsearch cluster with three nodes. Filebeats push the data from the app machines to the Elastic nodes via Logstash (no filtering yet). Kibana is up and running.

Questions:
1- How do I separate data from the different Linux machines so that each team only sees its own data in Kibana?
2- I am having a very hard time setting up certificates (not self-signed) on all the ELK machines (ES nodes, Logstash, Kibana, and Filebeat). If you know of a video or tutorial on this topic, please share.

For your first question, I can think of at least two possible ways to provide data to different teams. A lot depends on how important it is to keep one team from seeing data from another team's systems.

If that isn't important, and the separation of data is mostly about reducing clutter so that a team can focus on just its own event and log information, then you can build visualizations and dashboards based simply on a query for that team's IP addresses or hostnames.
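For example, a dashboard can be pinned to a KQL filter like the sketch below, where the hostnames are hypothetical placeholders for one team's machines:

```
host.name : ("app-server-01" or "app-server-02")
```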

Moving up in implementation complexity, but also doing more to limit access to other teams' data, would be to use Kibana Spaces.
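Spaces mainly organize what each team sees; to strictly keep one team out of another's data you would also pair them with security roles that restrict index access. As a rough sketch, a space can be created through the Kibana Spaces API; the URL and space name here are placeholders:

```
# Hypothetical example: create a "team-a" space via the Kibana Spaces API
curl -X POST "https://kibana.example.com:5601/api/spaces/space" \
  -H "kbn-xsrf: true" -H "Content-Type: application/json" \
  -u elastic \
  -d '{"id": "team-a", "name": "Team A"}'
```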

I don't know of any good videos or tutorials on setting up certificates - I learned the hard way half a decade ago - but I can offer a suggestion. Try a test cluster set up with self-signed/auto-configured security. Make a note of where the certificate files are located and what their ownership and permissions are, so you can use that information when building a cluster with "custom" certificates.
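Once you know where the files go, the elasticsearch.yml settings for custom certificates look roughly like the sketch below; the keystore paths and file names are placeholders for your own certificate files:

```yaml
# Rough elasticsearch.yml sketch for custom certificates
# (paths and file names are placeholders; the files must be readable
# by the user the Elasticsearch service runs as)
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: certs/node.p12
xpack.security.transport.ssl.truststore.path: certs/node.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: certs/http.p12
```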

I am thinking that the functionality of our little Java app can be achieved with Logstash alone. The problem with our app is that it is installed on each Linux machine, where it picks up the logs, and it slows the machine down. I have installed Logstash on a separate machine. If I can configure multiple pipelines that read logs from the application machines and send them to separate Windows folders on a network drive (roughly as sketched below), then it will provide essentially the same functionality as our application.
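Something like this pipelines.yml sketch is what I have in mind, one pipeline per application (the ids and paths are just placeholders):

```yaml
# Rough pipelines.yml sketch - one pipeline per application
- pipeline.id: app1
  path.config: "/etc/logstash/conf.d/app1.conf"
- pipeline.id: app2
  path.config: "/etc/logstash/conf.d/app2.conf"
```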

For collecting log information you would normally have Filebeat installed on the system generating the log, but Logstash can also be configured to listen on a port. If the systems generating the logs of interest can send to that port, either instead of or in addition to saving to a log file, then Logstash can collect the logs with essentially no footprint on the generating system. You can configure Logstash to listen for UDP or TCP traffic. See Udp input plugin | Logstash Reference [8.9] | Elastic and Tcp input plugin | Logstash Reference [8.9] | Elastic for configuration details.
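Here is a minimal sketch of a TCP listener, assuming your applications can be pointed at a socket (the port number is a placeholder):

```
input {
  # Listen for newline-delimited log lines on TCP port 5140 (placeholder port)
  tcp {
    port => 5140
    codec => line
  }
}
```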

Logstash can output to a file, using the file output plugin - see File output plugin | Logstash Reference [8.9] | Elastic for details.
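As a rough sketch, an output that writes each host's events to its own file on a mounted share might look like this; the mount point is a placeholder, and the exact host field name depends on how your events are structured (with recent Beats it is typically [host][name]):

```
output {
  # Write one file per sending host per day on the mounted network share
  # (mount point and field names are placeholders)
  file {
    path => "/mnt/team_share/%{[host][name]}/app-%{+yyyy.MM.dd}.log"
    codec => line { format => "%{message}" }
  }
}
```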
