We are looking for a solution to centralize custom logs from the different applications that make up a single system. The applications vary in nature: some are .NET C# apps on IIS, some are C# services running in Microsoft BizTalk Server, and some could be Ruby on Rails sites running under Apache on Linux.
All of these applications call each other in sequence to transform and deliver messages, and they all share one point in common: a request ID.
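To make that request ID searchable later, each app would need to emit it under the same field name in every log line. A minimal sketch in Ruby (all names here are illustrative assumptions, not our actual code): each service writes one JSON object per line and always includes the shared ID as `request_id`.

```ruby
require "json"
require "logger"

# Hypothetical helper: emit one JSON log line carrying the shared
# correlation field "request_id" plus any app-specific extras.
def log_event(logger, request_id, message, extra = {})
  logger.info({ request_id: request_id, message: message }.merge(extra).to_json)
end

logger = Logger.new($stdout)
# Emit the raw JSON line only, with no Logger timestamp prefix,
# so each line is directly parseable downstream.
logger.formatter = proc { |_severity, _time, _progname, msg| msg + "\n" }

log_event(logger, "9f1c2d", "received order", app: "rails-frontend")
log_event(logger, "9f1c2d", "transformed message", app: "biztalk-bridge")
```

The same idea applies on the .NET side; the only hard requirement is that every app agrees on the field name.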
We need a way to track the logs related to the same request ID across all the different apps in the system. In other words, we would like to be able to search Kibana for all logs, from all apps, where the request ID equals a specific value, and follow the flow easily.
We thought of using the ELK stack for this, but that would mean all of those apps need a common log format, or at least a common schema by the time Logstash ships them to Elasticsearch; otherwise it would be impossible to search across the multiple indices at once, since the fields wouldn't match. Is my assessment correct?
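For what it's worth, here is a sketch of how that normalization might look in a Logstash filter, assuming the apps cannot all be changed to log the same field name (the field and `app_type` names below are purely illustrative): each branch renames the app's native correlation field to one shared `request_id` field before indexing.

```conf
filter {
  # Hypothetical: .NET/IIS apps log the ID as "RequestId".
  if [app_type] == "dotnet" {
    mutate { rename => { "RequestId" => "request_id" } }
  }
  # Hypothetical: Rails apps log it as "request_uuid".
  if [app_type] == "rails" {
    mutate { rename => { "request_uuid" => "request_id" } }
  }
}
```

With that in place, a single Kibana query like `request_id: "9f1c2d"` could match documents from every app.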
Ideally, we would also like to explore the machine learning features and use visualisations in Kibana, and from what I understand, those would only work properly if all our data falls under the same index pattern. Correct?