Logstash parsing not consistent

New here and new to ELK. I've reached the point, though, where I think it's easier to ask, since someone may have seen this or know the issue. Two identical logs come into the system (file input type). One line matches my filter, gets tags applied, and runs through grok; the other should match but simply bypasses the filter section. It's as if the rate is too fast for Logstash to filter.

here's my logstash config:

input {
  file {
    path => ["/var/log/network.log"]
    start_position => "beginning"
    type => "syslog"
    tags => [ "netsyslog" ]
  }
}

# /opt/elk/logstash/patterns

filter {
  if [message] =~ "ASA" {
    mutate {
      add_tag => [ "ASA", "Firewall" ]
    }
    grok {
      patterns_dir => "/opt/elk/logstash/patterns"
      match => [ "message", "%{TIMESTAMP_ISO8601:server_time}<%{NUMBER:syslognum}><%{WORD:syslogpri}>%{CISCOTIMESTAMP:cisco_time} %{SYSLOGHOST:hostname} : %{GREEDYDATA:asa_message}" ]
    }
  } else {
    mutate {
      add_tag => [ "Other" ]
    }
  }
}

output {
  elasticsearch {
    protocol => "http"
    cluster => "elasticsearch"
    host => "localhost"
    template => "/opt/elk/logstash/templates/elasticsearch-template.json"
    template_overwrite => true
  }
}

The log messages are:

this one was parsed:
2015-10-06T15:51:59-04:00<166>Oct 06 2015 14:52:06 test-asa : %ASA-6-302013: Built outbound TCP connection 190884358 for corp:X.X.X.X/54321 (X.X.X.X/54321) to inside3:X.X.X.X/59884 (X.X.X.X/59884)

this one wasn't parsed:
2015-10-06T15:51:59-04:00<166>Oct 06 2015 14:52:06 test-asa : %ASA-6-302013: Built outbound TCP connection 190884360 for corp:X.X.X.X/54321 (X.X.X.X/54321) to inside3:X.X.X.X/59886 (X.X.X.X/59886)

For the one that was parsed I see all the fields my grok filter adds, so that looks 100% good. The one that wasn't parsed doesn't have any tags or anything, as if the match on "ASA" didn't catch for some reason. Am I doing something wrong?

Hi,

I took your two log lines and neither matched your grok. Removing <%{WORD:syslogpri}> seems to fix the issue, and it now parses both lines. Can you confirm you're using the right grok?

 match => [ "message", "%{TIMESTAMP_ISO8601:server_time}<%{NUMBER:syslognum}>%{CISCOTIMESTAMP:cisco_time} %{SYSLOGHOST:hostname} : %{DATA:asa_message}"]
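Grok patterns compile down to regular expressions, so you can sanity-check the difference outside Logstash. The regexes below are simplified, hand-written stand-ins for the grok patterns (not the real pattern definitions), matched against the line exactly as it appears in your post:

```python
import re

# Rough approximations of the two grok expressions (assumption: these
# are simplified stand-ins, close enough to show why one fails).
with_pri = re.compile(
    r"^(?P<server_time>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[+-]\d{2}:\d{2})"
    r"<(?P<syslognum>\d+)>"
    r"<(?P<syslogpri>\w+)>"  # the <%{WORD:syslogpri}> capture
    r"(?P<cisco_time>\w{3} \d{2} \d{4} \d{2}:\d{2}:\d{2}) "
    r"(?P<hostname>[\w.-]+) : (?P<asa_message>.*)$"
)
without_pri = re.compile(
    r"^(?P<server_time>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[+-]\d{2}:\d{2})"
    r"<(?P<syslognum>\d+)>"
    r"(?P<cisco_time>\w{3} \d{2} \d{4} \d{2}:\d{2}:\d{2}) "
    r"(?P<hostname>[\w.-]+) : (?P<asa_message>.*)$"
)

# The line exactly as posted: nothing between <166> and the timestamp.
line = ("2015-10-06T15:51:59-04:00<166>Oct 06 2015 14:52:06 test-asa : "
        "%ASA-6-302013: Built outbound TCP connection 190884358 for "
        "corp:X.X.X.X/54321 (X.X.X.X/54321) to inside3:X.X.X.X/59884 "
        "(X.X.X.X/59884)")

print(with_pri.match(line) is not None)     # False: no <word> after <166>
print(without_pri.match(line) is not None)  # True
```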

OK, so my log entry is correct, but the forum post is removing the detail. If I try to post something here that says < info >, but without the spaces around the word, the post simply drops it. Not sure what that's about.

So the above should read: remove the extra space before and after the word info below and you'll have the actual log message.

this one was parsed:
2015-10-06T15:51:59-04:00<166>< info >Oct 06 2015 14:52:06 test-asa : %ASA-6-302013: Built outbound TCP connection 190884358 for corp:X.X.X.X/54321 (X.X.X.X/54321) to inside3:X.X.X.X/59884 (X.X.X.X/59884)

this one wasn't parsed:
2015-10-06T15:51:59-04:00<166>< info >Oct 06 2015 14:52:06 test-asa : %ASA-6-302013: Built outbound TCP connection 190884360 for corp:X.X.X.X/54321 (X.X.X.X/54321) to inside3:X.X.X.X/59886 (X.X.X.X/59886)
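Since the priority word really is present in the raw log but may not survive every transport, one option (an untested sketch, keeping your field names) is to wrap that capture in an optional regex group so both shapes of line parse:

```
grok {
  patterns_dir => "/opt/elk/logstash/patterns"
  match => [ "message", "%{TIMESTAMP_ISO8601:server_time}<%{NUMBER:syslognum}>(<%{WORD:syslogpri}>)?%{CISCOTIMESTAMP:cisco_time} %{SYSLOGHOST:hostname} : %{GREEDYDATA:asa_message}" ]
}
```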


Screenshot of Kibana: similar messages. Some were caught by the "ASA" match and had tags added, one wasn't; not sure why. Some were parsed correctly with grok, while some reported failure even though I can take the actual message, paste it into the online grok debugger, and it says it's good.

Hi,

Have you tried using the rubydebug output codec to see what the events look like?

output {
   stdout { codec => rubydebug }
}

It may help you see why the grok is failing. Also try enabling --verbose or --debug and check the Logstash log file for any errors related to grok. I'm not sure why the tag isn't added, but if it's reproducible you may see an error in the rubydebug output, or in the log file if you use --debug.
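One quick way to try that, assuming a Logstash install under /opt/elk/logstash (a hypothetical path; adjust to yours), is to pipe a failing line through a one-off stdin pipeline with the same grok and the rubydebug output:

```
# sample.log contains one of the lines that fails to parse.
cat sample.log | /opt/elk/logstash/bin/logstash --debug -e '
  input { stdin { } }
  filter {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:server_time}<%{NUMBER:syslognum}><%{WORD:syslogpri}>%{CISCOTIMESTAMP:cisco_time} %{SYSLOGHOST:hostname} : %{GREEDYDATA:asa_message}" ]
    }
  }
  output { stdout { codec => rubydebug } }
'
```

If the event comes out with a _grokparsefailure tag here but the same line passes in the online debugger, the raw bytes on disk probably differ from what you pasted.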

Thanks for the suggestion. I actually updated to Logstash 2.0.0-beta1 (I was using 1.5.4) and now it's consistently parsing the messages. There were a few that still came through with _grokparsefailure, but those were only from firewalls that weren't configured to send their hostname in the syslog message, so it was fairly obvious what that was. I updated the firewall config to add the hostname to its syslog messages and now all my logs are getting parsed as expected.

thanks.