Grok filter for Oracle OSB Server logs

Hi Guys,

I am new to ELK and have been using it for the past few days. I am running the ELK stack on Docker, and the purpose is to monitor Oracle OSB server logs (part of the Oracle SOA Suite). I have been trying different grok patterns and combinations; however, none of them seem to work.

There are XML payloads in the OSB server logs that I am trying to capture as message fields in the final Elasticsearch output, viewed through Kibana.

Attached is a sample log image. The intention is to capture all valid XML payloads such as '<con:fault...' and '<ns4:getCustomerItemPriceElement>...', but it's not happening.

Kibana report: (screenshot attached)

I have tried using custom patterns, but they throw errors:

The pattern I used was something like the attached pattern file: PatternFile

Logstash grok file: grok_v1

Hi,

First off, you should not post pictures of text; it can make them pretty hard to read.

Looking at the error, it seems you are not closing your match filter correctly. There is an unmatched parenthesis error. Because it is a picture, I cannot quote that piece of text.

Good luck.
Paul.

Pasting the Logstash error below:

Sending Logstash's logs to /opt/logstash/logs which is now configured via log4j2.properties
[2018-05-10T13:44:50,141][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/opt/logstash/modules/fb_apache/configuration"}
[2018-05-10T13:44:50,171][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/opt/logstash/modules/netflow/configuration"}
[2018-05-10T13:44:50,984][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-05-10T13:44:51,992][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-05-10T13:44:52,883][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-10T13:44:56,890][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-05-10T13:44:57,697][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-05-10T13:44:57,719][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-05-10T13:44:58,095][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-05-10T13:44:58,209][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-05-10T13:44:58,219][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-05-10T13:44:58,247][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-10T13:44:58,289][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-10T13:44:58,375][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-05-10T13:44:58,721][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x35dcb32c @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 -  name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 -  name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 -  name: duration_in_millis value:0, @id=\"60a884e5c2495a8d94bb4c36d6ed6922d91072317b8ab83fe0088b721f8bcc9b\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x356516cc @metric=#<LogStash::Instrument::Metric:0x30307650 @collector=#<LogStash::Instrument::Collector:0x6727404b @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x3732ca50 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x44b5368f>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=60 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"60a884e5c2495a8d94bb4c36d6ed6922d91072317b8ab83fe0088b721f8bcc9b\", :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"./patterns\"], match=>{\"message\"=>\"<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{DATA:servername}> <%{DATA:timer}> %{OSBMESSAGE:Message}\"}, id=>\"60a884e5c2495a8d94bb4c36d6ed6922d91072317b8ab83fe0088b721f8bcc9b\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"unmatched close parenthesis: /<(?<DATA:log_timestamp>.*?)> <(?<WORD:log_level>\\b\\w+\\b)> <(?<DATA:servername>.*?)> <(?<DATA:timer>.*?)> (?<OSBMESSAGE:Message>(<\\w.*)|<.\\w.*))/m", :thread=>"#<Thread:0x2c8935a8 run>"}
[2018-05-10T13:44:58,770][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<RegexpError: unmatched close parenthesis: /<(?<DATA:log_timestamp>.*?)> <(?<WORD:log_level>\b\w+\b)> <(?<DATA:servername>.*?)> <(?<DATA:timer>.*?)> (?<OSBMESSAGE:Message>(<\w.*)|<.\w.*))/m>, :backtrace=>["org/jruby/RubyRegexp.java:928:in `initialize'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:127:in `compile'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:736:in `maybe_setup_out_plugins'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:362:in `start_workers'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:289:in `run'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"], :thread=>"#<Thread:0x2c8935a8 run>"}
[2018-05-10T13:44:58,841][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:mai
root@7c2ffadaa8ba:/opt/logstash/bin#

The grok filter pattern I am using is:

filter {
  grok {
    patterns_dir => ["./patterns"]
    match => ["message", "<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{DATA:servername}> <%{DATA:timer}> %{OSBMESSAGE:Message}"]
  }
}

The log file I am trying to analyze and capture the payload from is:

<Feb 13, 2018 9:29:47 PM CST> <Error> <ALSB Logging> <BEA-000000> < [RouteTo_SO_JDE_ProcessSalesOrder_BS, _onErrorHandler-5123344876563274982-167c5a26.13ca89c97be.-7a4b, Reply with Failure, ERROR] :: fault ::: <con:fault xmlns:con="http://www.bea.com/wli/sb/context">
  <con:errorCode>BEA-380001</con:errorCode>
  <con:reason>Internal Server Error</con:reason>
  <con:location>
    <con:node>RouteTo_SO_JDE_ProcessSalesOrder_BS</con:node>
    <con:path>response-pipeline</con:path>
  </con:location>
</con:fault>> 
<Feb 13, 2018 9:29:48 PM CST> <Error> <ALSB Logging> <BEA-000000> < [RouteTo_SO_JDE_ProcessSalesOrder_BS, _onErrorHandler-5123344876563274982-167c5a26.13ca89c97be.-7a4b, Reply with Failure, ERROR] :: varRequest ::: <SOAP-ENV:Body xmlns:ns2="java:oracle.e1.bssv.util.J4100010.valueobject" xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns1="java:oracle.e1.bssv.JP420000.valueobject" xmlns:ns4="http://oracle.e1.bssv.JP420000/" xmlns:ns3="java:oracle.e1.bssv.util.J0100010.valueobject" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <ns4:getCustomerItemPriceElement>
    <ns1:BusinessUnit>2000</ns1:BusinessUnit>
    <ns1:PriceAdjustmentId xsi:nil="true"/>
    <ns1:CurrencyCode xsi:nil="true"/>
    <ns1:RateExchange xsi:nil="true"/>
    <ns1:FreightHandlingCode xsi:nil="true"/>
    <ns1:DiscountTrade xsi:nil="true"/>
    <ns1:CustomerPriceGroupCode xsi:nil="true"/>
    <ns1:ZoneNumber xsi:nil="true"/>
    <ns1:RouteCode xsi:nil="true"/>
    <ns1:StopCode xsi:nil="true"/>
    <ns1:Product>
      <ns1:BusinessUnit xsi:nil="true"/>
      <ns1:LineType xsi:nil="true"/>
      <ns1:LotNumber xsi:nil="true"/>
      <ns1:Location xsi:nil="true"/>
      <ns1:UnitOfMeasureCodeVolume xsi:nil="true"/>
      <ns1:UnitOfMeasureCodeWeight xsi:nil="true"/>
      <ns1:ItemVolume xsi:nil="true"/>
      <ns1:ItemWeight xsi:nil="true"/>
      <ns1:ShipTo xsi:nil="true"/>
      <ns1:Item>
        <ns2:ItemId xsi:nil="true"/>
        <ns2:ItemProduct>571202</ns2:ItemProduct>
        <ns2:ItemCatalog xsi:nil="true"/>
        <ns2:ItemFreeForm xsi:nil="true"/>
        <ns2:ItemCustomer xsi:nil="true"/>
        <ns2:ItemDescription xsi:nil="true"/>
        <ns2:ItemUOMPrimary xsi:nil="true"/>
        <ns2:ItemUOMSecondary xsi:nil="true"/>
      </ns1:Item>
    </ns1:Product>
    <ns1:TransactionQuantity>1</ns1:TransactionQuantity>
    <ns1:UnitOfMeasureCodeTransaction>CA</ns1:UnitOfMeasureCodeTransaction>
    <ns1:UnitOfMeasureCodePricing xsi:nil="true"/>
    <ns1:DatePriceEffective>2018-02-13T00:00:00</ns1:DatePriceEffective>
    <ns1:AgreementID xsi:nil="true"/>
    <ns1:PaymentTermsCode xsi:nil="true"/>
    <ns1:PaymentInstrumentCode xsi:nil="true"/>
    <ns1:ModeOfTransportCode xsi:nil="true"/>
    <ns1:StatusCodeDuty xsi:nil="true"/>
    <ns1:EndUseCode xsi:nil="true"/>
    <ns1:LineOfBusinessCode xsi:nil="true"/>
    <ns1:PriceCode1>JM</ns1:PriceCode1>
    <ns1:PriceCode2 xsi:nil="true"/>
    <ns1:PriceCode3 xsi:nil="true"/>
    <ns1:Customer>
      <ns1:SoldTo xsi:nil="true"/>
      <ns1:ShipTo>
        <ns3:EntityId>50688260</ns3:EntityId>
        <ns3:EntityLongId xsi:nil="true"/>
        <ns3:EntityTaxId xsi:nil="true"/>
      </ns1:ShipTo>
    </ns1:Customer>
    <ns1:Processing>
      <ns1:ProcessingVersion>AEU0011</ns1:ProcessingVersion>
    </ns1:Processing>
    <ns1:Carrier xsi:nil="true"/>
  </ns4:getCustomerItemPriceElement>
</SOAP-ENV:Body>> 
<Feb 13, 2018 9:29:48 PM CST> <Error> <ALSB Logging> <BEA-000000> < [RouteTo_SO_JDE_ProcessSalesOrder_BS, _onErrorHandler-5123344876563274982-167c5a26.13ca89c97be.-7a4b, Reply with Failure, ERROR] :: body ::: <env:Body xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Fault>
    <faultcode>env:Server</faultcode>
    <faultstring>CAUSE . . . .  The price which was retrieved through advanced pricing is
               invalid because at least one mandatory price adjustment was not
               found.
RESOLUTION. .  Determine the missing price adjustment and correct or add the
               necessary price adjustment detail records so that the mandatory
               price adjustment will be found.

CAUSE . . . .  The operation did not succeed.               .
RESOLUTION:    See jde.log or jdedebug.log for more details on causes</faultstring>
    <detail>
      <java:BusinessServiceException xmlns:java="java:oracle.e1.bssvfoundation.exception">
        <java:RequestID>10.160.15.44:5666611518519588085:36793</java:RequestID>
        <java:Message>CAUSE . . . .  The price which was retrieved through advanced pricing is
               invalid because at least one mandatory price adjustment was not
               found.
RESOLUTION. .  Determine the missing price adjustment and correct or add the
               necessary price adjustment detail records so that the mandatory
               price adjustment will be found.

CAUSE . . . .  The operation did not succeed.               .
RESOLUTION:    See jde.log or jdedebug.log for more details on causes</java:Message>
      </java:BusinessServiceException>
    </detail>
  </env:Fault>
</env:Body>> 
<Feb 13, 2018 9:29:48 PM CST> <Error> <ALSB Logging> <BEA-000000> < [RouteTo_SO_JDE_ProcessSalesOrder_BS, _onErrorHandler-5123344876563274982-167c5a26.13ca89c97be.-7a4b, Reply with Failure, ERROR] :: fault ::: <con:fault xmlns:con="http://www.bea.com/wli/sb/context">
  <con:errorCode>BEA-380001</con:errorCode>
  <con:reason>Internal Server Error</con:reason>
  <con:location>
    <con:node>RouteTo_SO_JDE_ProcessSalesOrder_BS</con:node>
    <con:path>response-pipeline</con:path>
  </con:location>
</con:fault>>

The custom pattern "OSBMESSAGE" is the piece I am struggling to create as of now.

What are the patterns you are loading into the Grok filter, and how are they defined? The error message indicates that one of them fails to compile due to mismatched parentheses.

Yes, the parenthesis issue has been resolved. It was a custom pattern that I had added.
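For anyone hitting the same error: the compiled regex in the log above, (?<OSBMESSAGE:Message>(<\w.*)|<.\w.*)), has one more closing parenthesis than opening ones. A balanced definition in a file under ./patterns might look like the sketch below; the pattern body is only a guess reconstructed from that error message, not the thread's actual file.

# ./patterns/osb -- hypothetical file name
# Grok pattern file format: NAME, one space, then the regex.
OSBMESSAGE (?:<\w.*|<.\w.*)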

Looking at the screenshots, it appears as though you are using the default line codec, which emits one event per line of input. Since your log messages are multiline, you should use the multiline codec and configure it to capture each chunk as a single message.
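A minimal sketch of that idea, assuming the file input plugin and a made-up log path: each OSB entry starts with a "<Feb 13, 2018 ..." timestamp, so any line that does not start that way gets appended to the previous event.

input {
  file {
    # Hypothetical path; point this at the actual OSB server log.
    path => "/path/to/osb-server.log"
    codec => multiline {
      # Lines that do NOT begin with a "<Month DD, YYYY" timestamp
      # are part of the previous (multiline) event.
      pattern => "^<%{MONTH} %{MONTHDAY}, %{YEAR}"
      negate => true
      what => "previous"
    }
  }
}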

OK, I think I should then explore the multiline codec: https://www.elastic.co/guide/en/logstash/current/plugins-codecs-multiline.html

I will try this out. Please let me know if there are any other references I can use as well.
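One extra pointer, beyond the references above: once multiline has assembled the whole entry and grok has captured the payload into a field, the xml filter can parse it into structured fields. A rough, untested sketch; "Message" matches the grok config earlier in the thread, and "payload" is an arbitrary target name. Note that the captured Message still carries the "[RouteTo..., ...] :: fault ::: " prefix before the XML starts, so you would need another grok or mutate step to strip that off first.

filter {
  xml {
    # "Message" is the field the earlier grok filter captures into;
    # "payload" is just an illustrative target field name.
    source => "Message"
    target => "payload"
  }
}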

Hi,

I am following the instructions as per GitHub - logstash-plugins/logstash-codec-multiline, but I am getting the error below. I am using ELK on Docker for this purpose. I have searched for known issues but haven't found any so far.

The error is always:

root@7c2ffadaa8ba:/opt/logstash/vendor/bundle/jruby/2.3.0/bin# ./bundle
bash: ./bundle: /home/vagrant/projects/logstash/vendor/jruby/bin/jruby: bad interpreter: No such file or directory

Hi Rakesh_Js,

Don't do chmod 777 on files or directories; that is wrong on all sorts of levels :slight_smile:.

The problem is that the jruby specified in the bundle file does not exist. You can chmod it all you like, but that doesn't make the file appear :rofl:

I would start by looking at your Vagrant image and checking whether it has all the required packages.

Good luck,
Paul.
