To grok or not to grok?


(John Swift) #1

Hi,
Looking for some advice from the Elastic community.
I need to parse the following log entry into separate fields.
My initial thought was to write a grok pattern, since the structure is the same for each log entry, but I'm wondering if there is a better way, as the source logs are in XML format:

<m n="-1" p="Unknown" c="-1" d="10:20:28.161" u="-1" t="LiveNode" i="-1.1" s="70">
org.springframework.ldap.CommunicationException: xxxxxxxx:636; nested exception is javax.naming.CommunicationException: xxxxxxxxxx:636 [Root exception is javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target]
	at org.springframework.ldap.support.LdapUtils.convertLdapException(LdapUtils.java:100)
	at org.springframework.ldap.core.support.AbstractContextSource.createContext(AbstractContextSource.java:266)
	at org.springframework.ldap.core.support.AbstractContextSource.getContext(AbstractContextSource.java:106)
	at org.springframework.ldap.core.support.AbstractContextSource.getReadWriteContext(AbstractContextSource.java:138)
</m>

(Steffen Siering) #2

Looks like you need to enable multiline support. Logstash also has an XML filter. Personally, I'd always prefer a proper parser over grok (which is based on regular expressions).
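
A minimal pipeline sketch combining both suggestions, assuming each event starts with an `<m ` tag (the file path and the `target` field name are placeholders, not taken from your setup):

```conf
input {
  file {
    path => "/var/log/app/app.log"   # placeholder path
    # Join continuation lines (stack trace, closing </m>) onto the
    # event that started with "<m ".
    codec => multiline {
      pattern => "^<m "
      negate => true
      what => "previous"
    }
  }
}

filter {
  # Parse the <m> element; its attributes (n, p, c, d, u, t, i, s)
  # become fields under [parsed], and the stack trace is the element body.
  xml {
    source => "message"
    store_xml => true
    target => "parsed"
  }
}
```

From there you could rename `[parsed]` sub-fields to friendlier names with a `mutate` filter, or parse the timestamp in `d` with a `date` filter, rather than maintaining a grok pattern for the whole line.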