I have been using the ELK stack for some time, integrated with most of our application servers, which are developed in Java.
We use the log4j framework for logging, and initially I configured the Logstash agent on each server with an input/filter/output pipeline.
Each Java application uses a different logging pattern, so the format of the log statements differs from application to application.
As a result, I had to define a separate filter for each application to extract the fields before shipping the data to Elasticsearch.
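As an illustration, a per-application filter might look like the sketch below. The pattern and field names here are hypothetical, not taken from my actual setup; the point is that every application's log format needs its own variant of this block:

```
filter {
  grok {
    # Hypothetical pattern for one application's log format;
    # each application would need a different match pattern.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] %{JAVACLASS:class} - %{GREEDYDATA:msg}" }
  }
}
```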
This approach has several drawbacks. We have to maintain a different filter configuration for every application, and in most logs the context (e.g. userName, transactionId, etc.) is embedded inside the message.
So in Kibana I always have to run a full-text search to find the documents that contain a given userName, transactionId, or other piece of context.
Another disadvantage is that the Logstash agent on each application server has to apply the filter to every log entry, which is costly in terms of CPU.
So I developed a small Java API wrapper around log4j that emits the logs directly in JSON format, so no filters are required in the Logstash configuration.
The API also cleanly separates the context from the log message, which makes filtering and searching much easier.
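To give a flavor of the idea (this is my own minimal sketch, not the actual utility; the class and method names are hypothetical), the wrapper keeps context fields separate from the free-text message and serializes everything as one JSON log line. In the real wrapper, that line would then be handed to a log4j Logger, e.g. logger.info(line):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JsonLogFormatter {

    // Minimal JSON string escaping for backslashes and quotes.
    private static String esc(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }

    // Build a single JSON log line from a level, a message, and
    // context key/value pairs (userName, transactionId, ...).
    public static String format(String level, String message, Map<String, String> context) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"level\":\"").append(esc(level)).append("\"");
        sb.append(",\"message\":\"").append(esc(message)).append("\"");
        for (Map.Entry<String, String> e : context.entrySet()) {
            sb.append(",\"").append(esc(e.getKey())).append("\":\"")
              .append(esc(e.getValue())).append("\"");
        }
        sb.append("}");
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> ctx = new LinkedHashMap<>();
        ctx.put("userName", "alice");
        ctx.put("transactionId", "tx-42");
        // Prints: {"level":"INFO","message":"payment processed","userName":"alice","transactionId":"tx-42"}
        System.out.println(format("INFO", "payment processed", ctx));
    }
}
```

Because each context field becomes its own JSON key, Elasticsearch indexes userName and transactionId as separate fields, and Kibana can filter on them directly instead of relying on full-text search.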
If anyone is interested in using this utility, please let me know.