Grok Parsing failure

I am shipping log files into Logstash from Filebeat, grokking them, and inserting the results into Elasticsearch.

As my log file contains various formats, I created 6 different grok filters, all inside the same "if" on the input's type, and in each grok I added a unique tag in "tag_on_failure".

The currently problematic groks are for type crm_server_log:

if [type] == "crm_server_log"

When processing, I see that all the tags were added, from 1 to 6, and the document in Elasticsearch doesn't contain the field names I assigned.

I tried to use the grok debuggers, but it looks like my logs are too complicated for them.

Can I get some help here to create the groks?

*All links to data on GitHub will be provided privately by email.

Thanks
Sharon.

Please show your configuration and an example input line.

My log files look like this:

####<Jan 26, 2017 1:45:21 AM CET> <Info> <Deployer> <nlup08hr.vfnl.dc-ratingen.de> <CRMServer-2> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <
1485391521501> <BEA-149209> <Resuming.>
####<Jan 26, 2017 1:45:22 AM CET> <Error> <Kernel> <nlup08hr.vfnl.dc-ratingen.de> <CRMServer-2> <[ACTIVE] ExecuteThread: '26' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <
1485391522429> <BEA-000802> <ExecuteRequest failed
 java.lang.RuntimeException: java.net.ConnectException: t3://nlup10hr:30101: Destination unreachable; nested exception is:
        java.net.ConnectException: Connection refused; No available router to destination.
java.lang.RuntimeException: java.net.ConnectException: t3://nlup10hr:30101: Destination unreachable; nested exception is:
        java.net.ConnectException: Connection refused; No available router to destination
        at weblogic.transaction.internal.ServerCoordinatorDescriptorManagerImpl$1.run(ServerCoordinatorDescriptorManagerImpl.java:757)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
Caused By: java.net.ConnectException: t3://nlup10hr:30101: Destination unreachable; nested exception is:
        java.net.ConnectException: Connection refused; No available router to destination
        at weblogic.rjvm.RJVMFinder.findOrCreateInternal(RJVMFinder.java:216)
        at weblogic.rjvm.RJVMFinder.findOrCreate(RJVMFinder.java:170)
        at weblogic.rjvm.ServerURL.findOrCreateRJVM(ServerURL.java:153)
        at weblogic.rjvm.ServerURL.findOrCreateRJVM(ServerURL.java:87)
        at weblogic.rjvm.RJVMManager.findOrCreateEndPoint(RJVMManager.java:462)
        at weblogic.rmi.spi.RMIRuntime.findOrCreateEndPoint(RMIRuntime.java:42)
        at weblogic.rmi.extensions.server.RemoteDomainSecurityHelper.isRemoteDomain(RemoteDomainSecurityHelper.java:386)
        at weblogic.rmi.extensions.server.RemoteDomainSecurityHelper.getSubject(RemoteDomainSecurityHelper.java:132)
        at weblogic.transaction.internal.PlatformHelperImpl.getRemoteSubject(PlatformHelperImpl.java:411)
        at weblogic.transaction.internal.SecureAction.runAction(SecureAction.java:35)
        at weblogic.transaction.internal.ServerCoordinatorDescriptorManagerImpl$1.run(ServerCoordinatorDescriptorManagerImpl.java:754)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
Caused By: java.rmi.ConnectException: Destination unreachable; nested exception is:
        java.net.ConnectException: Connection refused; No available router to destination
        at weblogic.rjvm.ConnectionManager.bootstrap(ConnectionManager.java:470)
        at weblogic.rjvm.ConnectionManager.bootstrap(ConnectionManager.java:321)
        at weblogic.rjvm.RJVMManager.findOrCreateRemoteInternal(RJVMManager.java:254)
        at weblogic.rjvm.RJVMManager.findOrCreate(RJVMManager.java:197)
        at weblogic.rjvm.RJVMFinder.findOrCreateRemoteServer(RJVMFinder.java:238)
        at weblogic.rjvm.RJVMFinder.findOrCreateInternal(RJVMFinder.java:200)
        at weblogic.rjvm.RJVMFinder.findOrCreate(RJVMFinder.java:170)
        at weblogic.rjvm.ServerURL.findOrCreateRJVM(ServerURL.java:153)
        at weblogic.rjvm.ServerURL.findOrCreateRJVM(ServerURL.java:87)
        at weblogic.rjvm.RJVMManager.findOrCreateEndPoint(RJVMManager.java:462)
        at weblogic.rmi.spi.RMIRuntime.findOrCreateEndPoint(RMIRuntime.java:42)
        at weblogic.rmi.extensions.server.RemoteDomainSecurityHelper.isRemoteDomain(RemoteDomainSecurityHelper.java:386)
        at weblogic.rmi.extensions.server.RemoteDomainSecurityHelper.getSubject(RemoteDomainSecurityHelper.java:132)
        at weblogic.transaction.internal.PlatformHelperImpl.getRemoteSubject(PlatformHelperImpl.java:411)
        at weblogic.transaction.internal.SecureAction.runAction(SecureAction.java:35)
        at weblogic.transaction.internal.ServerCoordinatorDescriptorManagerImpl$1.run(ServerCoordinatorDescriptorManagerImpl.java:754)
        at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
        at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
>
####<Jan 26, 2017 1:45:18 AM CET> <Warning> <Management> <nlup08hr.vfnl.dc-ratingen.de> <CRMServer-2> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <>
 <> <1485391518724> <BEA-141277> <The JMX MBean PlatformMBeanServerUsed attribute is true, but the Platform MBeanServer was created without the hooks for the WLS security infrastructure. The Platform
MBeanServer will NOT be used and Platform MBeans will NOT be available via the WLS Runtime or Domain Runtime MBeanServers. This can occur if you have defined Platform MBeanServer system properties or
JVM options (-Dcom.sun.management.jmxremote or JRockit -XManagement).
 To allow the Platform MBeanServer to be used, you must either remove the system properties/JVM options or start WLS with the following system property:
 -Djavax.management.builder.initial=weblogic.management.jmx.mbeanserver.WLSMBeanServerBuilder
 If you want to eliminate this log error and do not need Platform MBeans to be available via WLS, then set the PlatformMBeanUsed attribute in the JMXMBean to false.>

Thanks
Sharon.

The groks I did in logstash.conf:

if [type] == "crm_server_log" {
        mutate {
                add_tag => [ "CRM_SERVER_LOGS" ]
                uppercase => [ "severity" ]
                gsub => [ "message", "\r", "" ]
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag1" ]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA}\> \<%{DATA:application_server}\> \<%{DATA}\> <<anonymous>> <> <> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{DATA}\>" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag2"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA}\> \<%{DATA:application_server}\> \<%{DATA}\> <<anonymous>> <> <> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{DATA}\.\>" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag3"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:message_type}\> \<%{DATA:server}\> \<%{DATA:log_name}\> \<\[%{DATA:log_desc}\> \<\<%{DATA:kernel}\>\> \<\> \<\> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{GREEDYDATA:error_message}\>" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag4"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:message_type}\> \<%{DATA:server}\> \<%{DATA:log_name}\> \<\[%{DATA:log_desc}\> \<\<%{DATA:kernel}\>\> \<\> \<\> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{GREEDYDATA:error_message}\.\>" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag5"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:message_type}\> \<%{DATA:server}\> \<%{DATA:log_name}\> \<\[%{DATA:log_desc}\> \<\<%{DATA:kernel}\>\> \<\> \<\> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{EXCEPTION:error_message}*?\>" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag6"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{GREEDYDATA:message_data}" }
              patterns_dir => "/etc/logstash/patterns"
        }
        grok {
              tag_on_failure => [ "BROKEN_GROK_SYSLOG", "_grokparsefailure" , "crm_tag7"]
              break_on_match => true
              keep_empty_captures => false
              match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{EXCEPTION:message_exception}" }
              patterns_dir => "/etc/logstash/patterns"
        }
        date {
              match => [ "timestamp" , "MMM dd, yyyy hh:mm:ss aa Z" ]
        }
}

Some patterns I created and used:

WEBLOGICTIMESTAMP_TZ %{MONTH} %{MONTHDAY}, %{YEAR} %{TIME} %{DL} ?%{TZ}?
TZ (?:[IPMCE][SD]T|UTC|CET)
EXCEPTION ((.*\r(?:(.*Exception:(.*?);.*(\r.*)(\tat.*\r)+)))|((?:[a-zA-Z]*)).)
LOGLEVEL ([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)

Thanks
Sharon.

You want this instead:

grok {
  match => {
    message => [
      "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA}\> \<%{DATA:application_server}\> \<%{DATA}\> <<anonymous>> <> <> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{DATA}\>",
      "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{DATA:module}\> \<%{DATA}\> \<%{DATA:application_server}\> \<%{DATA}\> <<anonymous>> <> <> \<%{NUMBER}\> \<%{DATA:error_code}\> \<%{DATA}\.\>",
      ...
    ]
  }
  ...
}

With your filters, you'll always get grok failures, since every single filter gets applied and in most cases all but one will fail. The break_on_match option only applies to the expressions listed within the same filter.

Secondly, you really should reduce your use of DATA and GREEDYDATA. They're highly inefficient and could result in unexpected matches if you're not careful.
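As a rough illustration of that point (a sketch only; the field names are examples, not your exact config), the fixed-position header fields can be matched with tighter patterns such as WORD and HOSTNAME instead of DATA:

grok {
  patterns_dir => "/etc/logstash/patterns"
  # WORD and HOSTNAME anchor the match much more tightly than DATA,
  # so the regex engine backtracks far less on lines that don't match.
  match => {
    "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\> \<%{LOGLEVEL:severity}\> \<%{WORD:module}\> \<%{HOSTNAME:server}\>"
  }
}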

Great.
Thanks for the information. I will try it and give feedback.

Thanks,
Sharon.

Hi,
The change didn't fix the result of the grok, which is still wrong.

  1. Still getting the _grokparsefailure tag.
  2. The timestamp from the log doesn't go into the timestamp field in the document.
  3. I can't see in the Kibana fields list all the field names that I added, such as severity, module, etc.

    In the Kibana message you can see the entry date is 26 Jan:

    In Elasticsearch the timestamp is the current date:

    Still getting the error:

Thanks,
Sharon.

Then you have to debug your expressions. Identify which expression should've matched and strip it down to the smallest possible expression, e.g. ####<. Verify that it works without producing _grokparsefailure tags. Then gradually add token by token until things break again and you've identified the problem.
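For example (a sketch of that approach, reusing the custom pattern posted earlier in this thread), start with just the leading timestamp and verify that it matches before adding anything else:

grok {
  patterns_dir => "/etc/logstash/patterns"
  # Smallest useful expression: the #### prefix plus the timestamp.
  # If this alone produces _grokparsefailure, the problem lies inside
  # WEBLOGICTIMESTAMP_TZ itself (e.g. the trailing time zone token).
  match => { "message" => "####\<%{WEBLOGICTIMESTAMP_TZ:timestamp}\>" }
}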


I understand. Thanks. Will do it.

It seems my problem is with the time zone field in the timestamp.

How should I translate from the local time zone to UTC (GMT)?

More than that, how do I get the time zone into @timestamp?

I tried z, Z, and ZZZ, but they don't work.
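(One common workaround, a sketch not taken from this thread: the date filter's Joda-Time parser cannot parse most zone names like "CET", so strip the zone from the timestamp field and declare it explicitly via the date filter's timezone option. "Europe/Berlin" below is an assumption that the logs are always CET/CEST.)

mutate {
  # Remove the trailing zone name that Joda-Time cannot parse.
  gsub => [ "timestamp", " CET$", "" ]
}
date {
  match    => [ "timestamp", "MMM dd, yyyy hh:mm:ss aa" ]
  timezone => "Europe/Berlin"   # assumption: source logs are CET/CEST
}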

Thanks
Sharon.

So is your grok filter working now? If yes, what are the contents of your timestamp field (the one you're feeding to your date filter)? What does your date filter look like?

Hi @ssasporta

You should try to use the klv.

Greetings

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.