GROK Failure

Please help, I have tried almost everything.

Here are my log and my Filebeat and Logstash configs.

```
OUT:999.777.666.172\tErrorCode=00000\tOpStatus=00000\tExceptionReason=null\tJSON={\"op_gsn\":\"\",\"op_status\":\"00000\",\"op_msg\":\"Processed\",\"flag\":\"Processed\"}","logger_name":"ACTIVITY","thread_name":"WebContainer : 2","level":"INFO","level_value":20000,"HOSTNAME":"","sessionId":"3aO92","IP":"","transactionID":"gatewayID","opcode":"opcode value","tags":null}
```


```
input {
  beats {
    port => 5044
    #codec => json
  }
}

filter {

  grok {
    match => { "message" => "{IP:OUT}%{BASE10NUM:ErrorCode}%{BASE10NUM:OpStatus}%{WORD:ExceptionReason}%data=%{GREEDYDATA:request}" }
    match => { "message" => "{IP:OUT}%{BASE10NUM:ErrorCode}%{BASE10NUM:OpStatus}%{WORD:ExceptionReason}% data=%{GREEDYDATA:request}" }
    match => { "message" => "{IP:IN}%{BASE10NUM:AccessAccount}%{BASE10NUM:OPCODE}%{WORD:DeviceModel}%{WORD:DeviceManufacturer}%{WORD:DeviceId} %{WORD:DeviceOsName}%{WORD:DeviceOsVersion}%{WORD:AppVersion}%data=%{GREEDYDATA:request}" }
    match => { "message" => "%{IP:OUT}%{SPACE}%{DATA}%{INT:ErrorCode}%{SPACE}%{DATA}%{INT:OpStatus}%{NOTSPACE}%{DATA}%{WORD:ExceptionReason}  data=%{GREEDYDATA:request}" }
  }

  json {
    source => "request"
    target => "JSON"
  }

  mutate {
    add_field => {
      "op_gsn" => "%{[JSON][op_gsn]}"
      "op_status" => "%{[JSON][op_status]}"
      "typeId" => "%{[JSON][typeId]}"
      "logger_name" => "%{[JSON][logger_name]}"
      "thread_name" => "%{[JSON][thread_name]}"
      "level" => "%{[JSON][level]}"
      "level_value" => "%{[JSON][level_value]}"
      "HOSTNAME" => "%{[JSON][HOSTNAME]}"
      "sessionId" => "%{[JSON][sessionId]}"
      "IP" => "%{[JSON][IP]}"
      "transactionID" => "%{[JSON][transactionID]}"
      "opcode" => "%{[JSON][opcode]}"
      "tags" => "%{[JSON][tags]}"
      "op_msg" => "%{[JSON][op_msg]}"
      "flag" => "%{[JSON][flag]}"
    }
  }
}

output {
  elasticsearch {
    hosts => ""
    # index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    # document_type => "%{[@metadata][type]}"
  }
}
```
```yaml
###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.full.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
# You can find the full configuration reference here:

#=========================== Filebeat prospectors =============================

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

filebeat.prospectors:
  - input_type: log
    #json.add_error_key: true
    #json.keys_under_root: true
    paths:
      - /opt/IBM/WAS/app/logs/bankingapp/message.log
    scan_frequency: 5s
    #backoff: 3s
    #multiline.pattern: ^\[
    #multiline.negate: true
    #multiline.match: after

filebeat.registry_file: /var/lib/filebeat/registry

#fields_under_root: true
#tags: ['json']

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: [""]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
```

There are lots of problems here, and I don't think you understand how these kinds of expressions work. How about using the grok constructor site to get help building an expression?

I have been using the site, but the pattern below doesn't match. Please assist.

```
message	ErrorCode=00002	OpStatus=00002	ExceptionReason=SYS_006	JSON={"op_gsn":"","op_status":"00002","op_msg":"An error occured in  App. Please call 0000000000 for help, quoting 00002."}
```


You need to be methodical about constructing grok expressions. Start from the beginning and build it out by iterating. You must make sure that the grok expression matches the full line, not just the fields you want to extract.

Start with the following pattern:

```
OUT:%{NOTSPACE:out}\t%{GREEDYDATA:rest}
```

Note that I have used NOTSPACE as 999.777.666.172 is not a valid IP address and will not match the IP expression. I have also added the literal OUT: to the start of the expression, as there will otherwise not be a match.

Now add one field at a time and test until you have built out the full expression.
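That iteration could look something like the following sketch, using the tab-separated sample line above (the field names here are only illustrations):

```
# Step 1: match the literal prefix and capture the address
match => { "message" => "OUT:%{NOTSPACE:out}\t%{GREEDYDATA:rest}" }

# Step 2: peel ErrorCode off the front of the remainder
match => { "message" => "OUT:%{NOTSPACE:out}\tErrorCode=%{INT:errorCode}\t%{GREEDYDATA:rest}" }

# Step 3: continue one field at a time until the whole line matches
match => { "message" => "OUT:%{NOTSPACE:out}\tErrorCode=%{INT:errorCode}\tOpStatus=%{INT:opStatus}\t%{GREEDYDATA:rest}" }
```

At each step, test against the sample line before adding the next field; if a step stops matching, the problem is in the piece you just added.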


Thank you, I had removed the actual IP address.

Another issue I noticed is that my logs are not being indexed.
My log files are named message.log, but other logs are indexed fine. What could be the problem?

I would recommend sorting out the grok patterns and processing logic using a stdout plugin with a rubydebug codec before worrying about whether data is indexed into Elasticsearch or not.

Do you have an example?

Use the grok constructor or the grok debugger. If you want to use Logstash directly, start with something like this:

```
input { stdin {} }

filter {
  grok {
    match => { "message" => ["OUT:%{NOTSPACE:out}\t%{GREEDYDATA:rest}"] }
  }
}

output { stdout { codec => rubydebug } }
```

Once you have parsed events in the format you want, you can start working on mappings and how to index them into Elasticsearch.

Thank you, I will do that,
but I can't see a Grok tab. Do I need to install it separately?

That is an X-Pack feature, and the process to install this is linked to in the documentation.

Thank you very much, I think it is getting clearer.
Much appreciated.

I managed to install X-Pack for Elasticsearch and Kibana, but when I install it for Logstash I get the following error:

```
URI::InvalidURIError: bad URI(is not URI?):
                          split at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:176
                          parse at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:210
                          parse at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:747
                            URI at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:994
  extract_proxy_values_from_uri at /usr/share/logstash/lib/pluginmanager/proxy_support.rb:47
                configure_proxy at /usr/share/logstash/lib/pluginmanager/proxy_support.rb:69
                         (root) at /usr/share/logstash/lib/pluginmanager/main.rb:26
```

I managed to install it by removing the proxy settings:

```
export http_proxy=""
```

Thank you

I managed to make some progress with grok. How do I access fields inside the JSON?

You capture the entire JSON content in one field and then you apply a json filter to that field. You will need to make sure that JSON= is not part of that string though.

Thanks, how do I exclude JSON=? This is my current filter:

```
filter {
  json {
    source => "message"
    target => "JSON"
  }
}
```

You need to match it in your grok expression so it does not appear in the captured data, similar to how I handled the initial OUT:.
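As a sketch, that could look like this (the surrounding field and the json_payload name are just placeholders):

```
# The literal "JSON=" is matched but not captured, so only the
# JSON document itself ends up in the json_payload field
match => { "message" => "ExceptionReason=%{NOTSPACE:exceptionReason}\tJSON=%{GREEDYDATA:json_payload}" }
```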

Should the JSON pattern be part of the top pattern? I am not sure how to do it.
I need the data inside broken down as well:

```
ErrorCode=00002	OpStatus=00002	ExceptionReason=SYS_006	JSON={"op_gsn":"","op_status":"00002","op_msg":"An error occured in  App. Please call 0000000000 for help, quoting 00002."}
```

This is what I currently get:

```
      "rest": "ExceptionReason=SYS_006\tJSON={\"op_gsn\":\"\",\"op_status\":\"00002\",\"op_msg\":\"An error occured in  App. Please call 0000000000 for help, quoting 00002.\"}",
      "errorCode": "00002",
      "opStatus": "00002",
      "out": ""
```

Don't use multiple DATA patterns like that; it's very inefficient. Try to find more exact patterns. NOTSPACE, for example, would work without being inefficient.

Two problems remain:

  • Extract the ExceptionReason field so it's not included in the rest field.
  • You don't want JSON= in the rest field.

Make sure the final data captured by the GREEDYDATA pattern matches just the full JSON document (not the initial JSON= part), then apply the JSON filter to this field rather than to message.
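Putting both fixes together, a complete filter for the OUT: sample line might look like this sketch (the field names, and anchoring on the literal labels, are assumptions based on the samples above):

```
filter {
  grok {
    match => { "message" => "OUT:%{NOTSPACE:out}\tErrorCode=%{INT:errorCode}\tOpStatus=%{INT:opStatus}\tExceptionReason=%{NOTSPACE:exceptionReason}\tJSON=%{GREEDYDATA:json_payload}" }
  }
  json {
    # parse the captured JSON string, not the whole message
    source => "json_payload"
    target => "JSON"
  }
}
```

Once this matches, nested fields such as [JSON][op_status] and [JSON][op_msg] can be referenced in later filters or at index time.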