How to create Grok Filter Pattern for Tomcat logs

Jun 29, 2008 11:16:20 AM org.apache.catalina.core.ApplicationContext log
INFO: ContextListener: contextInitialized()
Jun 29, 2008 11:16:20 AM org.apache.catalina.core.ApplicationContext log
INFO: SessionListener: contextInitialized()
Jun 29, 2008 11:22:43 AM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet jsp threw exception
org.apache.jasper.JasperException: /testmysql.jsp(3,4) Invalid directive
at org.apache.jasper.compiler.DefaultErrorHandler.jspError(
at org.apache.jasper.compiler.ErrorDispatcher.dispatch(
at org.apache.jasper.compiler.ErrorDispatcher.jspError(
at org.apache.jasper.compiler.Parser.parseDirective(
at org.apache.jasper.compiler.Parser.parseElements(
at org.apache.jasper.compiler.Parser.parse(
at org.apache.jasper.compiler.ParserController.doParse(
at org.apache.jasper.compiler.ParserController.parse(
at org.apache.jasper.compiler.Compiler.generateJava(
at org.apache.jasper.compiler.Compiler.compile(
at org.apache.jasper.compiler.Compiler.compile(
at org.apache.jasper.compiler.Compiler.compile(
at org.apache.jasper.JspCompilationContext.compile(
at org.apache.jasper.servlet.JspServletWrapper.service(
at org.apache.jasper.servlet.JspServlet.serviceJspFile(
at org.apache.jasper.servlet.JspServlet.service(
at javax.servlet.http.HttpServlet.service(
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(
at org.apache.catalina.core.ApplicationFilterChain.doFilter(
at org.apache.catalina.core.StandardWrapperValve.invoke(
at org.apache.catalina.core.StandardContextValve.invoke(
at org.apache.catalina.core.StandardHostValve.invoke(
at org.apache.catalina.valves.ErrorReportValve.invoke(
at org.apache.catalina.core.StandardEngineValve.invoke(
at org.apache.catalina.connector.CoyoteAdapter.service(
at org.apache.coyote.http11.Http11Processor.process(
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(

My Tomcat logs look something like this.
I want to know how to create a grok filter pattern for these types of logs.

First of all, are you using a multiline codec (or Filebeat's multiline feature if you're using Filebeat to read the logs)?
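For reference, a minimal Filebeat multiline sketch for logs like the ones above (the path and the exact YAML layout are illustrative assumptions, adjust them to your installation). Any line that does not start with a date like "Jun 29, 2008" is appended to the previous event, so the stack trace stays attached to its SEVERE line:

```
# Sketch only: path is an assumption, adjust to your Tomcat install.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/tomcat/catalina.out
    # Lines NOT starting with e.g. "Jun 29, 2008" belong to the previous event.
    multiline.pattern: '^[A-Z][a-z]{2} \d{1,2}, \d{4}'
    multiline.negate: true
    multiline.match: after
```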

If so, try using the grok constructor site to set up a grok expression.
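As a starting point, something like the following might work for the Catalina format shown above. CATALINA_DATESTAMP, JAVACLASS, and LOGLEVEL ship with Logstash's default grok patterns; the field names (timestamp, class, method, level, logmessage) are just illustrative choices:

```
# Sketch only: assumes each event is a full multiline Catalina entry.
filter {
  grok {
    match => {
      "message" => "%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{WORD:method}\n%{LOGLEVEL:level}: %{GREEDYDATA:logmessage}"
    }
  }
}
```

Verify it against your actual events on the grok constructor site before relying on it.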

@magnusbaeck thank you for the reply.
I have set up the Filebeat and Logstash configuration, but the problem I am facing is that I am not able to query my results by timestamp, as there are two timestamp fields (@timestamp and timestamp).

Please help me.


The presence of two timestamps isn't a problem in itself.

Use a date filter to parse timestamp into @timestamp, then delete timestamp.
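For example, for the "Jun 29, 2008 11:16:20 AM" format above, the date filter could look like this (the Joda-time pattern is an assumption based on the sample logs; remove_field only runs if parsing succeeds, so failed events keep their original timestamp for debugging):

```
filter {
  date {
    # Parses e.g. "Jun 29, 2008 11:16:20 AM" into @timestamp.
    match => ["timestamp", "MMM dd, yyyy hh:mm:ss a"]
    # Drop the now-redundant field on success.
    remove_field => ["timestamp"]
  }
}
```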

@magnusbaeck thank you for the reply.
I applied the filter; my Logstash config is:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} [%{LOGLEVEL:level}]%{GREEDYDATA:messageText}%{IP:client}" }
  }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS", "ISO8601"]
    timezone => "UTC"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "roha"
  }
}
