Extract certain fields from Oracle listener log


(Mikael Sandström) #1

Hi,

First off, I'm new to logstash so I appreciate any help I can get with this problem.

I'm trying to get some information from the Oracle listener log into Logstash, but I'm not entirely sure how to do that. Logstash reads the file just fine, and I can view the info in Kibana with no problems. But I really just want certain parts of the information.
Below is a sample of the logfile. There's a lot of information, but the only rows I'm really interested in are the ones that contain the word 'establish'.
Then, for each of those lines, I want a certain number of fields (from the example line below):
DATE -> (12-OCT-2015 10:32:33)
SERVICE_NAME -> (asmdb)
PROGRAM -> (sqlplus@db1)
USER -> (oracle)
HOST -> (db1)

So basically, a grep for 'establish' and then a regexp for the fields I want. I guess grok is the way to go, but I can't really figure it out.

Mon Oct 12 10:31:09 2015
WARNING: Subscription for node down event still pending
12-OCT-2015 10:32:09 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=db1)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER)(VERSION=186647552)) * status * 0
Mon Oct 12 10:32:32 2015
12-OCT-2015 10:32:33 * (CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=asmdb)(CID=(PROGRAM=sqlplus@db1)(HOST=db1)(USER=oracle))) * (ADDRESS=(PROTOCOL=tcp)(HOST=127.0.0.1)(PORT=38999)) * establish * asmdb * 0
12-OCT-2015 10:32:33 * service_update * asmdb * 0
12-OCT-2015 10:32:36 * service_update * asmdb * 0
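The direction I imagine is something like the following (just a sketch of intent, not something I've gotten to work; the field names are only what I'd like to end up with):

```conf
filter {
  # The "grep": drop everything that isn't a connection line.
  if [message] !~ /establish/ {
    drop { }
  }
  # The "regexp": pull the interesting fields out of the establish line.
  grok {
    match => { "message" => "^(?<date>%{MONTHDAY}-\w{3}-%{YEAR} %{TIME}).*\(SERVICE_NAME=%{DATA:service_name}\).*\(PROGRAM=%{DATA:program}\)\(HOST=%{DATA:client}\)\(USER=%{DATA:user}\)" }
  }
}
```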

Any ideas?

Thanks in advance!
/M


(Mikael Sandström) #2

Hi again,

Forgot to mention I'm on Logstash 1.5.4

/M


(Magnus Bäck) #3

What have you tried so far?


(Mikael Sandström) #4

Hi,

Sorry for the late answer, I got held up with other things yesterday.

I actually got this working this morning (or at least sort of).

I have started writing a couple of patterns to extract the fields that I need:

SERVICE_NAME (?<=(SERVICE_NAME=)).*?(?=\))
PROGRAM (?<=(PROGRAM=)).*?(?=\))
HOST (?<=(HOST=)).*?(?=\))
USER (?<=(USER=)).*?(?=\))
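These lookbehind expressions can be sanity-checked outside Logstash with plain Ruby, which uses a regex engine compatible with grok's (the sample line is taken from the log excerpt above):

```ruby
# Sanity-check of the custom lookbehind patterns against a sample line.
line = "12-OCT-2015 10:32:33 * (CONNECT_DATA=(SERVER=DEDICATED)" \
       "(SERVICE_NAME=asmdb)(CID=(PROGRAM=sqlplus@db1)(HOST=db1)(USER=oracle)))" \
       " * (ADDRESS=(PROTOCOL=tcp)(HOST=127.0.0.1)(PORT=38999)) * establish * asmdb * 0"

service_name = line[/(?<=(SERVICE_NAME=)).*?(?=\))/]  # "asmdb"
program      = line[/(?<=(PROGRAM=)).*?(?=\))/]       # "sqlplus@db1"
host         = line[/(?<=(HOST=)).*?(?=\))/]          # "db1" (first HOST= only)
user         = line[/(?<=(USER=)).*?(?=\))/]          # "oracle"
```

Note that each pattern only ever returns the first occurrence, so the second HOST= (the listener address) is ignored.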

And if I use this:

grok {
  type => "oracle-listener-conn"
  match => {"message" => "%{NOTSPACE}%{SPACE} %{TIME} \*%{SPACE}%{NOTSPACE}%{SERVICE_NAME:service_name}%{NOTSPACE}%{PROGRAM:program}%{NOTSPACE}%{HOST:client}%{NOTSPACE}%{USER:user}" }
  patterns_dir => '/etc/logstash/patterns/'
}

I get this:

    "message" => "12-OCT-2015 13:31:24 * (CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=asmdb)(CID=(PROGRAM=sqlplus@db1)(HOST=db1)(USER=oracle))) * (ADDRESS=(PROTOCOL=tcp)(HOST=127.0.0.1)(PORT=43307)) * establish * asmdb * 0",
    "@version" => "1",
    "@timestamp" => "2015-10-12T11:31:24.757Z",
      "host" => "db1",
      "path" => "/u01/app/oracle/diag/tnslsnr/db1/listener/trace/listener.log",
      "type" => "oracle-listener-conn",
      "service_name" => "asmdb",
      "program" => "sqlplus@db1",
      "client" => "db1",
      "user" => "oracle"

Which is what I want, so that's good.

However, the order of the fields may shift around, and some of them may also be empty, in which case I'll miss those.
Is there a way to write the match part more generically, or do I have to write a separate line for every possible scenario?

Thanks!
regards
/M


(Magnus Bäck) #5

You might be able to use the kv filter, but the enclosing parentheses might make it difficult (the trim and trimkey options should be useful). Otherwise, a ruby filter will definitely allow you to write a generic extractor of key/value pairs.
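A minimal sketch of such a generic extractor, e.g. for use inside a ruby filter (the regex and the first-occurrence-wins handling are my assumptions; deeply nested values like SERVICE=(ADDRESS=...) are not handled):

```ruby
# Hypothetical generic key/value extractor for listener-log connect strings.
line = "12-OCT-2015 10:32:33 * (CONNECT_DATA=(SERVER=DEDICATED)" \
       "(SERVICE_NAME=asmdb)(CID=(PROGRAM=sqlplus@db1)(HOST=db1)(USER=oracle)))" \
       " * (ADDRESS=(PROTOCOL=tcp)(HOST=127.0.0.1)(PORT=38999)) * establish * asmdb * 0"

fields = {}
# Each KEY=value pair ends at a parenthesis. Keep only the first occurrence
# of each key (so HOST stays the client host, not the listener address) and
# skip empty values such as the PROGRAM= seen in some entries.
line.scan(/(\w+)=([^()]*)[()]/) do |key, value|
  fields[key.downcase] ||= value unless value.empty?
end
# fields now holds e.g. "service_name"=>"asmdb", "host"=>"db1", "user"=>"oracle"
```

Inside a ruby filter on Logstash 1.5 the line would come from event['message'], and the extracted pairs could be written back with event[key] = value.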


(Mikael Sandström) #6

Cool, I'll have a look at that.

Thank you!

/M


(Ravi) #7

I tried your pattern to extract the fields, but I am getting a compile error or no match (_grokparsefailure). Below are sample records from the Oracle listener log. I am looking for all the fields in the key=value assignments.

30-APR-2014 08:46:00 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=(ADDRESS=(PROTOCOL=TCP)(HOST=west4)(PORT=1521)))(VERSION=186647040)) * status * 0
Wed Apr 30 08:46:51 2014
30-APR-2014 08:46:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:47:51 2014
30-APR-2014 08:47:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:48:51 2014
30-APR-2014 08:48:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:49:51 2014
30-APR-2014 08:49:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=east4)(VERSION=186647040)) * status * 0


(Ravi) #8

Magnus,

I tried your pattern to extract the fields, but I am getting a compile error or no match (_grokparsefailure). Below are sample records from the Oracle listener log. I am looking for all the fields in the key=value assignments, and I also need the data that comes after them, starting with the status.

30-APR-2014 08:46:00 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=(ADDRESS=(PROTOCOL=TCP)(HOST=west4)(PORT=1521)))(VERSION=186647040)) * status * 0
Wed Apr 30 08:46:51 2014
30-APR-2014 08:46:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:47:51 2014
30-APR-2014 08:47:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:48:51 2014
30-APR-2014 08:48:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
Wed Apr 30 08:49:51 2014
30-APR-2014 08:49:51 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=sdp04)(USER=oracle))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER_WFLOWPRD)(VERSION=186647040)) * status * 0
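For the status lines above, a sketch of a grok expression that captures the trailing fields after the connect data might look like this (untested; the field names are my own choice):

```conf
grok {
  # connect_data soaks up the whole parenthesised block; action and result
  # are the fields after the final asterisks ("status" and "0" above).
  match => { "message" => "^%{MONTHDAY}-\w{3}-%{YEAR} %{TIME} \* %{DATA:connect_data} \* %{WORD:action} \* %{INT:result}$" }
}
```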


(Tony Aponte) #9

Are you referring to the different types of listener-log event lines (i.e. connection requests, status, tnspings, DB registration events), each with a different number of fields? I ran into the same thing and ended up using conditionals. Although my example sources from the XML version of the logs, the conditional statements may still apply to your issue:

filter {
  if "listner_log" in [tags] {
    multiline {
      # NOTE: the forum rendering appears to have eaten the backslash escapes
      # and the "<msg" in this pattern; both are restored below as my best
      # reconstruction. Lines that do not start a new <msg> element are
      # joined onto the following event.
      pattern => "^((?!<msg).)*$"
      what => "next"
      negate => false
    }
    xml {
      store_xml => "false"
      source => "message"
      target => "doc"
      xpath => [
        "/msg/@client_id", "msg_client_id",
        "/msg/@comp_id", "msg_comp_id",
        "/msg/@host_id", "msg_host_id",
        "/msg/@host_addr", "msg_host_addr",
        "/msg/@level", "msg_level",
        "/msg/@type", "msg_type",
        "/msg/@time", "msg_time",
        "/msg/txt/text()", "msg_txt"
      ]
    }
    date {
      match => [ "msg_time", "ISO8601" ]
    }
    if [message] =~ /CONNECT_DATA/ {
      grok {
        match => [ "msg_txt", "%{URIHOST} %{TIME} \* %{DATA:uri} \* %{DATA:address} \* %{DATA:action} \* %{DATA:service_name} \* %{GREEDYDATA:result}" ]
      }
      grok { match => [ "uri", "\(SERVICE_NAME=%{DATA:uri_service_name}\)" ] }
      grok { match => [ "uri", "\(COMMAND=%{DATA:uri_command}\)" ] }
      grok { match => [ "uri", "\(PROGRAM=%{DATA:uri_program}\)" ] }
      grok { match => [ "uri", "\(HOST=%{DATA:uri_srchost}\)" ] }
      grok { match => [ "uri", "\(USER=%{DATA:uri_user}\)" ] }
      grok { match => [ "uri", "\(ARGUMENTS=%{DATA:uri_arguments}\)" ] }
      grok { match => [ "uri", "\(SERVICE=%{DATA:uri_service}\)" ] }
      grok { match => [ "uri", "\(VERSION=%{DATA:uri_version}\)" ] }
      grok { match => [ "address", "\(PROTOCOL=%{DATA:address_protocol}\)" ] }
      grok { match => [ "address", "\(HOST=%{DATA:address_srchost}\)" ] }
      grok { match => [ "address", "\(PORT=%{DATA:address_psrcport}\)" ] }
    } else {
      grok {
        break_on_match => false
        match => [ "msg_txt", "%{URIHOST} %{TIME} \* %{DATA:action} \* %{DATA:instance_name} \* %{GREEDYDATA:result}" ]
      }
    }
  }
}

