New Install ELK 6.1 with Filebeat 6.1 for Tomcat Catalina Logs

Hi,

I am a newbie to the ELK Stack. I have set it up on a test machine in our lab environment but have not been able to parse Apache Tomcat logs. It would be a great help if anyone could assist me in completing this.

My goal is a centralized log monitoring solution, and while searching the Internet I found the ELK Stack. I have installed it successfully and configured it to ship "/var/log/messages". But my actual requirement is to process Tomcat logs, which I understand needs a multiline configuration that I have not managed to get working. My requirements are:

  1. To have Tomcat logs filtered for valid TRACE, DEBUG, INFO, WARN, and ERROR levels, along with "/var/log/messages"
  2. How to centralize these Tomcat logs, running under different servers with different instances, in the ELK Stack (see the sketch just below this list)
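
My understanding so far is that centralizing means running a Filebeat agent on every Tomcat server, each shipping to the one central Logstash. A minimal sketch of what I imagine each agent's filebeat.yml would contain (the tomcat1_cat_log field value is just a placeholder of mine for telling instances apart):

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /usr/local/tomcat/logs/catalina.out
  # tag each server/instance so they can be told apart in Kibana
  fields: {log_type: tomcat1_cat_log}

# every agent points at the same central Logstash host
output.logstash:
  hosts: ["192.168.3.226:5044"]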

Please help me; I have been trying to get this working for the past week.

I have gone through the official documentation for Logstash and Filebeat, but it has only left me more confused, so I would kindly request help in configuring the ELK Stack to track Tomcat Catalina logs from multiple servers. I have set up ELK Stack version 6.1 with Filebeat 6.1 on Linux (Red Hat 7.4). Please find my current working configuration for "/var/log/messages" below.

############# Filebeat.yml #######################

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/messages

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:
  host: "192.168.3.226:5601"

output.logstash:
  hosts: ["192.168.3.226:5044"]

###################################################

############# Logstash.conf #######################

input {
  beats {
    host => "192.168.3.226"
    port => 5044
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["192.168.3.226:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

###################################################

Thank You

FYI we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out! :wink:

Are you seeing anything via the stdout you have defined?
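
If nothing shows up at all, it's worth ruling out a config syntax problem first. Assuming the default RPM paths, something like

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --config.test_and_exit

will parse the pipeline file and report errors without actually starting Logstash.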

With the current configuration I can see "/var/log/messages" in the Kibana interface: Filebeat ships the logs to Logstash, where they are filtered before the output is indexed and displayed.

But when going through the catalina.out log I noticed that its entries span multiple lines, so I need Filebeat's multiline options to capture the Tomcat ERROR entries and a grok filter in Logstash to get them listed in Kibana.

But my attempts at this have not produced the expected result, and when I try to run it alongside the "/var/log/messages" setup I get errors as well. Please help. :disappointed_relieved:

For the Tomcat configuration I have edited my filebeat.yml and logstash_tomcat.conf files as follows, and I am now receiving the log entries in my Kibana interface. But while going through them I found that some entries carry a "_grokparsefailure" tag that is not in the logs themselves, and another entry in the same series carries "_dateparsefailure". Please find the relevant files below: filebeat.yml, logstash_tomcat.conf, the JSON output, and the error-log snippet.

Please correct me if my configurations are wrong, as I am new to ELK + Filebeat.

############# Filebeat.yml ###############

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /usr/local/tomcat/logs/catalina.out
  multiline.pattern: '^[[:space:]]'
  multiline.pattern: '^[[:space:]]+|^Caused by:'
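  # note: YAML keeps only the last occurrence of a duplicate key,
  # so the first multiline.pattern above is silently ignored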
  multiline.negate: false
  multiline.match: after
  document_type: tomcat_log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:
  host: "192.168.3.226:5601"

output.logstash:
  hosts: ["192.168.3.226:5044"]

##########################################

############# logstash_tomcat.conf #######

input {
  beats {
    type => "tomcat_log"
    host => "192.168.3.226"
    port => 5044
  }
}

filter {
  if [type] == "tomcat_log" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:severity} %{GREEDYDATA:message}" ]
      overwrite => [ "message" ]
    }
    date {
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if [type] == "tomcat_log" {
    elasticsearch {
      manage_template => false
      hosts => ["192.168.3.226:9200"]
    }
  }
}

##########################################
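
For reference, a one-off pipeline can be used to smoke-test the grok expression against a single well-formed line before touching the real config (a sketch, assuming the default RPM install path for the logstash binary):

echo '2017-10-31 16:00:00 ERROR TaskUtils$LoggingErrorHandler:95 - Unexpected error occurred in scheduled task.' | \
/usr/share/logstash/bin/logstash -e '
input { stdin {} }
filter {
  grok { match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:severity} %{GREEDYDATA:message}" ] }
}
output { stdout { codec => rubydebug } }'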
##################_grokparsefailure ######

{
  "_index": "logstash-2018.01.01",
  "_type": "doc",
  "_id": "QnYPsWABQTfQ6qM1YymO",
  "_version": 1,
  "_score": null,
  "_source": {
    "source": "/usr/local/tomcat/logs/catalina.out",
    "@version": "1",
    "host": "kibana.ctax.dev",
    "type": "tomcat_log",
    "beat": {
      "version": "6.1.1",
      "name": "kibana.ctax.dev",
      "hostname": "kibana.ctax.dev"
    },
    "offset": 29553175,
    "message": "Hibernate: insert into gst_log.gstn_api_log (api_versn, encrpt_key, reqst_body, reqst_date, reqst_query_strng, reqst_type, reqst_url, reqst_year_month, rspns_sts, rspns_body, rspns_ek, rspns_hmac, reqst_rspns_id) VALUES (?, ?, ?::jsonb, ?::timestamptz, ?, ?, ?, ?::int, ?, ?::jsonb, ?, ?, ?::int)",
    "prospector": {
      "type": "log"
    },
    "tags": [
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ],
    "@timestamp": "2018-01-01T09:30:57.951Z"
  },
  "fields": {
    "@timestamp": [
      "2018-01-01T09:30:57.951Z"
    ]
  },
  "sort": [
    1514799057951
  ]
}
##########################################

##################_dateparsefailure ######
{
  "_index": "logstash-2018.01.01",
  "_type": "doc",
  "_id": "go8dsWABQTfQ6qM1d8us",
  "_score": 1,
  "_source": {
    "source": "/usr/local/tomcat/logs/catalina.out",
    "@version": "1",
    "timestamp": "2017-10-13 16:00:00",
    "beat": {
      "version": "6.1.1",
      "name": "kibana.ctax.dev",
      "hostname": "kibana.ctax.dev"
    },
    "offset": 593030236,
    "message": "TaskUtils$LoggingErrorHandler:95 - Unexpected error occurred in scheduled task.",
    "type": "tomcat_log",
    "host": "kibana.ctax.dev",
    "severity": "ERROR",
    "prospector": {
      "type": "log"
    },
    "tags": [
      "beats_input_codec_plain_applied",
      "_dateparsefailure"
    ],
    "@timestamp": "2018-01-01T09:46:25.418Z"
  },
  "fields": {
    "@timestamp": [
      "2018-01-01T09:46:25.418Z"
    ]
  }
}
##########################################

############ Error-log snippet ###########

Hibernate: insert into gst_log.gstn_api_log (api_versn, encrpt_key, reqst_body, reqst_date, reqst_query_strng, reqst_type, reqst_url, reqst_year_month, rspns_sts, rspns_body, rspns_ek, rspns_hmac, reqst_rspns_id) VALUES (?, ?, ?::jsonb, ?::timestamptz, ?, ?, ?, ?::int, ?, ?::jsonb, ?, ?, ?::int)
2017-10-31 16:00:00 ERROR TaskUtils$LoggingErrorHandler:95 - Unexpected error occurred in scheduled task.
java.lang.Error: Unresolved compilation problem:
Unhandled exception type ParseException

    at nic.kerala.adsm.attendancesystem.configuration.SchedulerConfig.scheduleTaskUsingCronExpression(SchedulerConfig.java:49)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:65)
    at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
    at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

Hibernate: select nextval ('gst_log.gstn_api_log_reqst_rspns_id_seq')
Hibernate: insert into gst_log.gstn_api_log (api_versn, encrpt_key, reqst_body, reqst_date, reqst_query_strng, reqst_type, reqst_url, reqst_year_month, rspns_sts, rspns_body, rspns_ek, rspns_hmac, reqst_rspns_id) VALUES (?, ?, ?::jsonb, ?::timestamptz, ?, ?, ?, ?::int, ?, ?::jsonb, ?, ?, ?::int)
Hibernate: insert into gst_log.gstn_api_log (api_versn, encrpt_key, reqst_body, reqst_date, reqst_query_strng, reqst_type, reqst_url, reqst_year_month, rspns_sts, rspns_body, rspns_ek, rspns_hmac, reqst_rspns_id) VALUES (?, ?, ?::jsonb, ?::timestamptz, ?, ?, ?, ?::int, ?, ?::jsonb, ?, ?, ?::int)

##########################################

You're barking up the wrong tree with the Filebeat multiline configuration. The logic you're looking for is "unless the line begins with a timestamp, join with the next line". Follow the pattern at https://www.elastic.co/guide/en/beats/filebeat/current/_examples_of_multiline_configuration.html#_timestamps.

Once the multiline processing is done correctly you can start worrying about any grok and date parse failures.
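
Roughly like this, as a sketch based on that documentation page (adjust the regular expression to your exact timestamp layout):

multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'  # a line starting with a date begins a new event
multiline.negate: true                            # every line that does NOT match...
multiline.match: after                            # ...is appended to the previous event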

Hi,

As described in your link, I have updated my filebeat.yml as follows so that multiline events in the error log are joined based on the timestamp. Please help me clear the "_grokparsefailure" and "_dateparsefailure" errors I am still facing.

############ filebeat.yml ######################

filebeat.prospectors:
- type: log
  enabled: true
  multiline.match: after
  multiline.negate: true
  multiline.pattern: '[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}'
  paths:
    - /usr/local/tomcat/logs/catalina.out
  fields: {log_type: tomcat1_cat_log}
  document_type: tomcat_log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:
  host: "192.168.3.226:5601"

output.logstash:
  hosts: ["192.168.3.226:5044"]

################################################

Also, if I have missed anything, please help me correct my filebeat.yml file.

Please use a stdout { codec => rubydebug } output and show an example event produced by Logstash, plus your Logstash configuration.

Hi magnusbaeck, please find the output below; it still contains the "_dateparsefailure" tag.

#############################################
{
  "beat" => {
    "hostname" => "kibana.ctax.dev",
    "version" => "6.1.1",
    "name" => "kibana.ctax.dev"
  },
  "source" => "/usr/local/tomcat/logs/catalina.out",
  "type" => "tomcat_log",
  "@timestamp" => 2018-01-04T10:57:19.283Z,
  "severity" => "ERROR",
"message" => "LocalDataSourceJobStore:2867 - Error retrieving job, setting trigger state to ERROR.\norg.quartz.JobPersistenceException: Couldn't retrieve job because a required class was not found: nic.kerala.gst.scheduler.jobs.ledger.CashLedgerJob [See nested exception: java.lang.ClassNotFoundException: nic.kerala.gst.scheduler.jobs.ledger.CashLedgerJob]\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport.retrieveJob(JobStoreSupport.java:1393)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport.acquireNextTrigger(JobStoreSupport.java:2864)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport$41.execute(JobStoreSupport.java:2805)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport$41.execute(JobStoreSupport.java:2803)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport.executeInNonManagedTXLock(JobStoreSupport.java:3849)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport.acquireNextTriggers(JobStoreSupport.java:2802)\n\tat org.quartz.core.QuartzSchedulerThread.run(QuartzSchedulerThread.java:287)\nCaused by: java.lang.ClassNotFoundException: nic.kerala.gst.scheduler.jobs.ledger.CashLedgerJob\n\tat org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1285)\n\tat org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1119)\n\tat org.springframework.scheduling.quartz.ResourceLoaderClassLoadHelper.loadClass(ResourceLoaderClassLoadHelper.java:76)\n\tat org.springframework.scheduling.quartz.ResourceLoaderClassLoadHelper.loadClass(ResourceLoaderClassLoadHelper.java:81)\n\tat org.quartz.impl.jdbcjobstore.StdJDBCDelegate.selectJobDetail(StdJDBCDelegate.java:852)\n\tat org.quartz.impl.jdbcjobstore.JobStoreSupport.retrieveJob(JobStoreSupport.java:1390)\n\t... 6 more",
"offset" => 2192,
"host" => "kibana.ctax.dev",
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_dateparsefailure"
],
"timestamp" => "2017-11-02 23:59:50",
"prospector" => {
"type" => "log"
},
"fields" => {
"log_type" => "tomcat1_cat_log"
},
"@version" => "1"
}

#############################################

############# logstash.yml #################

path.data: /var/lib/logstash
path.config: /etc/logstash/conf.d/*.conf
path.logs: /var/log/logstash

############################################

############# conf.d/logstash_tomcat.conf ####

input {
  beats {
    type => "tomcat_log"
    host => "192.168.3.226"
    port => 5044
  }
}

filter {
  if [type] == "tomcat_log" {
    grok {
      patterns_dir => "/etc/logstash/patterns"
      match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:severity} %{GREEDYDATA:message}" ]
      overwrite => [ "message" ]
    }
    date {
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if [type] == "tomcat_log" {
    elasticsearch {
      manage_template => false
      hosts => ["192.168.3.226:9200"]
    }
  }
}

############################################

Your date pattern doesn't match the actual timestamp in the timestamp field (drop the milliseconds).
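
Something along these lines should work, assuming the timestamps in catalina.out always have plain second resolution:

date {
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
  remove_field => [ "timestamp" ]
}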

Hi magnusbaeck, thanks for pinpointing the timestamp format. That solved my "_dateparsefailure" issue.

But while viewing the logs in the Kibana UI, I noticed that they appear scrambled relative to their order in the log file. Please find my catalina.out and Kibana UI screenshots enclosed.

Please refer to the lines below in both images.

ROOT has finished in 321 ms
docs has finished in 9 ms
examples has finished in 186 ms
host-manager has finished in 13 ms
manager has finished in 10 ms

Screen-shot Kibana UI

Screen-shot Catalina.out

Kibana doesn't preserve the original order. It orders events based on the timestamp, and with only second resolution that's not going to give you the right order.

Hi magnusbaeck, I am now back to the "_grokparsefailure" error mentioned in my earlier post. Please find the output containing the "_grokparsefailure" tag below.

############################################# ERROR SNIPPET #############

"@timestamp" => 2018-01-16T04:59:22.000Z,
"type" => "tomcat_log",
"severity" => "ERROR"
}
{
"message" => " Source field value: nic.kerala.gst.registration.entity.gstn.Gstp@22411420b4[professionalAddressDetails=nic.kerala.gst.registration.entity.gstn.Address@30sacee48[addressId=1289041,bldgNum=test house wwwwww po,floorNum=,bldgName=,strt=sswaruvikuzhy,locty=anwwickadu,dst=KLKWWOT,stateCode=3112,pinCode=68622503,lat=,lon=,adressType=PA,aplnType=RTTR1,entyType=RTTR1,cntry=,cntryCode=,documents=[nic.kerala.gst.registration.entity.gstn.DocUpld@61ac2b85d],resubmittedDocuments=],enrollmentDetails=nic.kerala.gst.registration.entity.gstn.GstpEnrmtDtls@10f28eb5d,graduationDetails=nic.kerala.gst.registration.entity.gstn.GstpGrdtnDtls@49d289d53,applicantDetails=nic.kerala.gst.registration.entity.gstn.GstpAplntsDtls@505d6969,oldGstPractnrDetails=,queryClrfnData=,scnQueries=,modfctnRegnApln=,addnlInfo=,draftId=3404234,documents=,arnDtlsId=405608,aplnType=RTTR1,arn=AA3201180050423,createdDate=2018-01-12 21:00:16.497735,modifiedDate=,regnDtls=nic.kerala.gst.registration.entity.gstn.RegnDtls@3f0ffd52[arnDtlsId=405608,aplnType=RTTR1,aplnState=PFV,modfdDate=2018-01-12,dueDate=,email=ert345@gmail.com,mob=8222210651,date=,pan=FEPP22227L,pt=,legalNameBsns=,atzdSgnryName=,submitDate=,dsrctCode=KLKOT,stateCode=3222,aplnSts=,gstPrctnrName=xxx wwww,panOrTan=,cpin=,provsnlId=],decln=nic.kerala.gst.registration.entity.gstn.Decln@72956d9c[arnDtlsId=405608,declnPlace=aruwwvikuzhy,declnDate=2018-01-12,signType=EVC,asDescn=,asNum=,declnName=,declnText=,verified=true,applicantName=rrrrrrrr,pan=FEPPS0rr937L],origHash=b2brrrrrdef7f2af98525fed8b4c209e1rr64edc4e7feadaa1rb515a1158bed9,lastActionDate=2018-01-28,isMigrated=,securityToken=32184444400003599TRN,gstin=,current=true,alertDtlsId=]\n Dest parent class: nic.kerala.gst.registration.vo.GstpEnrmtDrrrrtlsVO\n Dest field name: gstp\n Dest field type: nic.kerala.gst.registration.entity.gstn.Gstp\norg.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: nic.kerala.gst.registration.entity.gstn.GstpAplyyntsDtls.resubmittedPhotos, could not initialize proxy - no Session\n\tat org.hibernate.collection.internal.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:582)\n\tat org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:201)\n\tat org.hibernate.collection.internal.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:145)\n\tat org.hibernate.collection.internal.PersistentBag.size(PersistentBag.java:261)\n\tat org.dozer.MappingProcessor.prepareDestinationList(MappingProcessor.java:837)\n\tat org.dozer.MappingProcessor.addOrUpdateToList(MappingProcessor.java:762)\n\tat org.dozer.MappingProcessor.addOrUpdateToList(MappingProcessor.java:850)\n\tat org.dozer.MappingProcessor.mapListToList(MappingProcessor.java:686)\n\tat org.dozer.MappingProcessor.mapCollection(MappingProcessor.java:541)\n\tat org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:434)\n\tat org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342)\n\tat org.dozer.MappingProcessor.mapField(MappingProcessor.java:288)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:248)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:197)\n\tat org.dozer.MappingProcessor.mapCustomObject(MappingProcessor.java:495)\n\tat org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:446)\n\tat org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342)\n\tat 
org.dozer.MappingProcessor.mapField(MappingProcessor.java:288)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:248)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:197)\n\tat org.dozer.MappingProcessor.mapCustomObject(MappingProcessor.java:495)\n\tat org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:446)\n\tat org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342)\n\tat org.dozer.MappingProcessor.mapField(MappingProcessor.java:288)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:248)\n\tat org.dozer.MappingProcessor.map(MappingProcessor.java:197)\n\tat org.dozer.MappingProcessor.mapCustomObject(MappingProcessor.java:495)\n\tat o org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)\n\tat org.apache.coyote.ajp.AjpProcessor.service(AjpProcessor.java:486)\n\tat org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)\n\tat org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:861)\n\tat org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1455)\n\tat org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)\n\tat java.lang.Thread.run(Thread.java:748)",
"fields" => {
"log_type" => "tomcat1_cat_log"
},
"prospector" => {
"type" => "log"
},
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_grokparsefailure"
],
"@version" => "1",
"host" => "kibana.ctax.dev",
"offset" => 967530,
"beat" => {
"name" => "kibana.ctax.dev",
"hostname" => "kibana.ctax.dev",
"version" => "6.1.1"
},
"source" => "/usr/local/tomcat/logs/catalina.out",
"@timestamp" => 2018-01-16T09:22:17.381Z,
"type" => "tomcat_log"
}

#############################################

*** The error output above has been trimmed; the complete error report is enclosed for reference.

The logstash.yml, logstash_tomcat.conf, and filebeat.yml configurations remain the same as above.

Complete error-log part-1

Complete error-log part-2

Complete error-log part-3

Please find below a snippet of the modified grok-patterns file used to parse the Tomcat catalina.out log. I have extended the default grok-patterns file with an additional Java Logs section to handle Tomcat/Java stack-trace entries like those shown above.

########## /etc/logstash/patterns/grok-patterns ##########
......

# Log formats
SYSLOGBASE %{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:

# Log Levels
LOGLEVEL ([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)

####### Modified by me ############

# Java Logs
JAVATHREAD (?:[A-Z]{2}-Processor[\d]+)
JAVACLASS (?:[a-zA-Z0-9-]+\.)+[A-Za-z0-9$]+
JAVAFILE (?:[A-Za-z0-9_.-]+)
JAVASTACKTRACEPART at %{JAVACLASS:class}\.%{WORD:method}\(%{JAVAFILE:file}:%{NUMBER:line}\)
JAVALOGMESSAGE (.*)

# MMM dd, yyyy HH:mm:ss eg: Jan 9, 2014 7:13:13 AM
CATALINA_DATESTAMP %{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)

# yyyy-MM-dd HH:mm:ss,SSS ZZZ eg: 2014-01-09 17:32:25,527 -0800
TOMCAT_DATESTAMP 20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}

CATALINALOG %{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}

# 2014-01-09 20:03:28,269 -0800 | ERROR | com.example.service.ExampleService - something compeletely unexpected happened...
TOMCATLOG %{TOMCAT_DATESTAMP:timestamp} \| %{LOGLEVEL:level} \| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}

#################################################
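
With these patterns on disk, my understanding is that the grok filter could reference them by name rather than spelling out the expression inline, something like the sketch below. The caveat is that TOMCATLOG expects a |-separated layout with a timezone, which my current log lines do not have, so the pattern itself would still need adjusting:

grok {
  patterns_dir => ["/etc/logstash/patterns"]
  # (?m) so the leading timestamp/level can be followed by a multiline stack trace
  match => [ "message", "(?m)%{TOMCATLOG}" ]
}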

Any update, please?

This topic was automatically closed after 28 days. New replies are no longer allowed.