I am designing an ELK stack with Elasticsearch, Logstash, and Kibana installed on a dedicated server, and Filebeat installed on all the machines generating log files. Sounds like a standard setup.
My question is how to configure the connection between Filebeat and Logstash so that logs can be post-processed in Logstash before being sent to Elasticsearch. Specifically, I have two types of log files: JSON-formatted logs from a Node.js app, and Nginx logs. I need a mechanism to distinguish between these two types of input in Logstash.
The examples I could find simply show Logstash receiving its input from Filebeat. But to convert the Nginx logs to JSON before inserting them into Elasticsearch, I need to distinguish between the Filebeat instances shipping Nginx logs and those shipping Node.js logs. How do I do that?
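For context, here is the kind of approach I have been considering: a minimal sketch, assuming Filebeat's `fields` option can be used to tag each input (the field name `log_type`, the paths, and the hostname are my own placeholders):

```yaml
# filebeat.yml on the Nginx hosts
filebeat.inputs:
  - type: log
    paths:
      - /var/log/nginx/access.log
    fields:
      log_type: nginx        # custom tag, name chosen arbitrarily

# filebeat.yml on the Node.js hosts
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log  # hypothetical path
    fields:
      log_type: nodejs

output.logstash:
  hosts: ["logstash.example.com:5044"]  # hypothetical host
```

Logstash could then branch on that field with a conditional in the filter section:

```
input {
  beats { port => 5044 }
}

filter {
  if [fields][log_type] == "nginx" {
    # parse the raw Nginx access-log line into structured fields
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [fields][log_type] == "nodejs" {
    # the app already emits JSON, so just decode it
    json { source => "message" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

Is this the idiomatic way to do it, or is there a better mechanism?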
Or should I be processing the logs (by converting them to JSON) in Filebeat before sending them to Logstash?
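If that second route is viable, I assume it would look something like this (a sketch using Filebeat's JSON decoding options; the path is a placeholder):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # hypothetical path
    # decode each line as JSON and lift the keys to the top level of the event
    json.keys_under_root: true
    # record a parse-error field instead of dropping malformed lines
    json.add_error_key: true
```

Though as far as I can tell this only helps with the already-JSON Node.js logs; the Nginx lines would still need grok-style parsing somewhere, which is why I suspect the work belongs in Logstash.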