Filebeat custom module is not parsing logs correctly

Hey, I created a custom module in the Filebeat module directory for GlassFish server logs. I could not use the command

make create-fileset

to create a fileset on my Windows machine; it kept failing with an error. So I just took the apache2 folder in the module directory and adapted its access and error filesets as required for my purpose. The base grok pattern for a GlassFish log message is

"patterns": ["\\[#\\|%{TIMESTAMP_ISO8601:glassfish.server.timestamp}\\|%{LOGLEVEL:glassfish.server.loglevel}\\|%{DATA:glassfish.server.application}\\|%{GREEDYDATA:glassfish.server.component}\\|%{GLASSFISHTHREADS:glassfish.server.threadinfo}\\|%{GREEDYDATA:glassfish.server.message}\\|#\\]"]

I want to extract some fields from the log message. Below are two sample messages from the GlassFish server logs:

[#|2017-07-18T13:58:00.340-0400|INFO|glassfish3.1.2|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=21;_ThreadName=Thread-2;|ERROR 2d3539313438333230393333353734313934326c6f63616c686f737438303835 MemberRegistrationAction - 
org.app.common.horizontal.validations.ValidationException
at org.app.common.web.util.ServiceGateway.checkException(ServiceGateway.java:176)
at org.app.common.web.util.ServiceGateway.get(ServiceGateway.java:56)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:280)
at org.apache.struts.actions.LookupDispatchAction.execute(LookupDispatchAction.java:252)
at 
at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:532)
 at com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThreadPool.java:513)
at java.lang.Thread.run(Thread.java:662)
|#]

and

[#|2017-07-13T15:54:32.973-0400|INFO|glassfish3.1.2|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=27;_ThreadName=Thread-2;|2017 07 13 15:54:32: [Error]
 Source Class = org.app.common.horizontal.jndi.JndiObjectUtilities
 Source method = createInitialContext
 Application message = Missing value for the initial context factory class name.
|#]

So I created a custom pipeline, shown below, to extract fields such as glassfish.server.app_log_level, glassfish.server.app_correl_id, etc.

"processors": [
{
"grok": {
"field": "message",
	"patterns": ["\\[#\\|%{TIMESTAMP_ISO8601:glassfish.server.timestamp}\\|%{LOGLEVEL:glassfish.server.loglevel}\\|%{DATA:glassfish.server.application}\\|%{GREEDYDATA:glassfish.server.component}\\|%{GLASSFISHTHREADS:glassfish.server.threadinfo}\\|%{LOGLEVEL:glassfish.server.app_log_level} %{GREEDYDATA:glassfish.server.app_correl_id} %{DATA:glassfish.server.app_class_name} - %{GREEDYMULTILINE:glassfish.server.app_message}\\|#\\]","\\[#\\|%{TIMESTAMP_ISO8601:glassfish.server.timestamp}\\|%{LOGLEVEL:glassfish.server.loglevel}\\|%{DATA:glassfish.server.application}\\|%{GREEDYDATA:glassfish.server.component}\\|%{GLASSFISHTHREADS:glassfish.server.threadinfo}\\|%{GREEDYDATA:glassfish.server.app_timestamp}: \\[%{LOGLEVEL:glassfish.server.app_log_level}\\]: \\[%{GREEDYDATA:glassfish.server.app_correl_id}\\] Source Class = %{GREEDYDATA:glassfish.server.app_class_name} %{GREEDYMULTILINE:glassfish.server.app_message}\\|#\\]"
    ],
	"ignore_missing": true,
"pattern_definitions": {
  "GLASSFISHTHREADS": "_ThreadID=%{NUMBER:glassfish.server.threadid};_ThreadName=Thread-%{NUMBER:glassfish.server.threadnumberinname};", "GREEDYMULTILINE" : "(.|\n)*"		   
} 
}
}
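
For reference, the patterns can be exercised without running Filebeat by sending the grok processor to the Elasticsearch Simulate Pipeline API. The request below is only a minimal sketch: it uses just the base pattern from the top of this post against a single-line event condensed from the second sample above, not the full pipeline.

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["\\[#\\|%{TIMESTAMP_ISO8601:glassfish.server.timestamp}\\|%{LOGLEVEL:glassfish.server.loglevel}\\|%{DATA:glassfish.server.application}\\|%{GREEDYDATA:glassfish.server.component}\\|%{GLASSFISHTHREADS:glassfish.server.threadinfo}\\|%{GREEDYDATA:glassfish.server.message}\\|#\\]"],
          "pattern_definitions": {
            "GLASSFISHTHREADS": "_ThreadID=%{NUMBER:glassfish.server.threadid};_ThreadName=Thread-%{NUMBER:glassfish.server.threadnumberinname};"
          }
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "[#|2017-07-13T15:54:32.973-0400|INFO|glassfish3.1.2|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=27;_ThreadName=Thread-2;|Missing value for the initial context factory class name.|#]"
      }
    }
  ]
}

The response shows the extracted glassfish.server.* fields in each document's _source, or a grok failure for documents that do not match, which helps narrow down which of the two longer patterns is the one that fails.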

However, I also have another pattern of logs from GlassFish, like the one below:

[#|2017-07-13T15:54:33.056-0400|SEVERE|glassfish3.1.2|javax.enterprise.system.tools.deployment.org.glassfish.deployment.common|_ThreadID=28;_ThreadName=Thread-2;|Exception while visiting com/sun/gjc/common/DataSourceSpec.class of size 3267
java.lang.NullPointerException
|#]

I am simply ignoring those by setting "ignore_missing": true in my pipeline. I have a custom template, and I am using the command below to load the glassfish module:

.\filebeat.exe -c filebeat.yml -e -v -modules=glassfish

When I load my custom dashboards into Kibana, no values are inserted into my custom fields; everything ends up in the default "message" field from the Filebeat template.
It seems the fields are not being parsed correctly; see the image below, where the data is not set in these fields.

A few things:

  • It sounds like you need to configure multiline in the module, to combine multiple lines together before parsing. Have you done that? You can find an example in the mysql module; a rough sketch is also included after this list.
  • Rather than using regex |, it might be easier to declare multiple patterns. They will be attempted in order.
  • Did you use the simulate API to test the parsing?
  • The Kibana fields are automatically generated from the fields.yml of the module. Have you used that to declare the schema? A trimmed fields.yml sketch also follows below.
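
For the multiline point, here is a rough sketch of the fileset input config, modeled on the mysql module. The module/fileset names and the file path are assumptions based on your glassfish.server.* field names, and on Filebeat 5.x the first line would be input_type: log instead of type: log.

# module/glassfish/server/config/server.yml -- path and fileset name assumed
type: log
paths:
{{ range $i, $path := .paths }}
 - {{$path}}
{{ end }}
exclude_files: [".gz$"]
# Join every line that does NOT start with "[#|" onto the previous event,
# so each [#|...|#] record reaches the ingest pipeline as a single message.
multiline:
  pattern: '^\[#\|'
  negate: true
  match: after

And for the fields.yml point, a trimmed sketch of a fileset-level _meta/fields.yml, assuming the field names from your pipeline (only a few fields shown; the types are suggestions):

# module/glassfish/server/_meta/fields.yml -- path and fileset name assumed
- name: server
  type: group
  description: >
    Fields from the GlassFish server log.
  fields:
    - name: loglevel
      type: keyword
    - name: app_log_level
      type: keyword
    - name: app_correl_id
      type: keyword
    - name: app_class_name
      type: keyword
    - name: app_message
      type: text

Normally the module's fields.yml is merged into Filebeat's top-level fields.yml at build time (make update); since make is not working for you on Windows, adding the fields to the shipped fields.yml by hand should have the same effect. You then need to reload the index template and refresh the Kibana index pattern, otherwise the new fields stay unmapped.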

It seems like you have hit multiple issues; perhaps we can take them one by one, starting with whatever is affecting you the most?

Also, for my understanding, is this module general, in the sense that it could be used by all GlassFish applications? Or does it only work for yours?
