I have set up the ELK stack to centralize application logs. I'm new to the ELK stack but was able to get it up and running. All my application logs are now being shipped to Elasticsearch, and I can see them in Kibana. Now I need to index the data and create filters for it.

Here is a sample of the logs that need to be analyzed:
06:29:13.825 [main] INFO o.a.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8080"]
06:29:13.838 [main] INFO o.a.coyote.http11.Http11NioProtocol - Starting ProtocolHandler ["http-nio-8080"]
06:29:13.843 [main] INFO o.a.tomcat.util.net.NioSelectorPool - Using a shared selector for servlet write/read
06:29:13.861 [main] INFO o.s.b.c.e.t.TomcatEmbeddedServletContainer - Tomcat started on port(s): 8080 (http)
06:29:13.866 [main] INFO c.p.w.t.TransformationServiceApplication - Started TransformationServiceApplication in 23.291 seconds (JVM running for 2
- How can I index the data according to the application name (e.g. customer-service)?
- How can I filter the data according to the log level (INFO, DEBUG, Spring logs, etc.)?
Note: every component of the ELK stack is deployed as a Docker container.
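For reference, this is the direction I'm considering: a Logstash `grok` filter to parse the level and logger out of each line, plus a per-application index. This is only a sketch based on the sample lines above; the `app_name` field, the Filebeat `fields` block it would come from, and the index naming pattern are my assumptions, not anything already configured:

```
filter {
  grok {
    # Parse lines like:
    # 06:29:13.825 [main] INFO o.a.coyote.http11.Http11NioProtocol - Initializing ...
    match => {
      "message" => "%{TIME:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:level} %{JAVACLASS:logger} - %{GREEDYDATA:log_message}"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    # Assumes the shipper (e.g. Filebeat) adds a "fields.app_name" value
    # such as "customer-service" for each application's log stream.
    index => "logs-%{[fields][app_name]}-%{+YYYY.MM.dd}"
  }
}
```

With the `level` field extracted, filtering in Kibana would be a simple query such as `level: INFO`, and the per-application indices could be selected via index patterns like `logs-customer-service-*`. Is this the right approach, or is there a better way?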