Siebel log monitoring through the ELK stack

Hi All,

We are already monitoring all of our WebLogic and SOA logs with the ELK stack, and now we are planning to monitor our Siebel logs the same way.

Every 5 minutes, log files matching /tmp/htimObjMgr_enu_0299_*.log are generated with content like the following:

2021 2016-05-10 01:18:16 0000-00-00 00:00:00 -0400 00000000 001 003f 0001 09 htimObjMgr_enu 2121270867 39010 -1136657552 /tmp/htimObjMgr_enu_2023_2121270867.log 8.1.1.11 [23030] ENU
ObjMgrLog Error 1 0020552a5713208d:0 2016-05-10 01:18:16 (cscfg.cpp (165)) SBL-CSR-00418: Communication: User is not associated with any communication configuration in the database.
ObjMgrLog Error 1 0020552a5713208d:0 2016-05-10 01:19:14 (sweview.cpp (1884)) SBL-UIF-00401: View: CUT Home Page View (CME) does not contain applet: .

I prepared a .conf file with the code below. The data is being saved in Elasticsearch, but I am not able to see it in Kibana.

Please assist me and help me find a solution for monitoring these Siebel logs.

input {
  stdin {
    type => "stdin-type"
  }
  # Tail the Siebel object manager logs as they are written
  file {
    type => "htim_log"
    path => ["/tmp/htim*.log"]
  }
}
filter {
  multiline {
    pattern => "\n^[A-Z]"
    what => "previous"
  }
  grok {
    type => "htim_log"
    pattern => ["\A%{WORD:EventType}%{SPACE}%{WORD:EventSubType}%{SPACE}%{INT:Severity}%{SPACE}%{WORD:SARMID}%{NOTSPACE}%{SPACE}%{PROG:EventDate}%{SPACE}%{TIME:EventTime}%{SPACE}%{GREEDYDATA:LogMessage}"]
    add_field => ["Project", "Siebel"]
    add_field => ["Environment", "prod14"]
    add_field => ["Lifecycle Status", "Production"]
    add_field => ["Location", "NC"]
    add_field => ["Log Name", "prod14 htim Log"]
    add_field => ["Server", "usncx295"]
  }
  mutate {
    uppercase => [ "EventSubType" ]
    uppercase => [ "LogMessage" ]
  }
}
output {
  elasticsearch {
    host => "server135"
    protocol => "http"
  }
}

Thanks in advance.

Regards,
Bharath

How do you know this?

Are you sure the data is there?

Yes.
Because I can see that the fields I mention in the pattern are showing in the Kibana fields list. Also, running with the verbose option shows the fields properly.

Maybe your time frame is wrong; have you tried changing that?

I'm not clear on that. Could you please elaborate?

In Kibana you define a time range for the data you are looking at. Perhaps you need to adjust that?
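A quick way to rule out indexing problems while you test is to temporarily add a stdout output alongside the elasticsearch one, so you can see exactly what Logstash is producing (a sketch against the config you posted; the rubydebug codec just pretty-prints each event):

output {
  # Print every event to the console for inspection
  stdout { codec => rubydebug }
  elasticsearch {
    host => "server135"
    protocol => "http"
  }
}

If the @timestamp on the printed events is the ingestion time rather than the time in the log line, Kibana's time picker can easily be looking at a window that contains no events.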

Please try the filter below in your configuration file:

filter {
  grok {
    pattern => ["\A%{WORD:EventType}%{SPACE}%{WORD:EventSubType}%{SPACE}%{INT:Severity}%{SPACE}%{WORD:SARMID}%{NOTSPACE}%{SPACE}%{PROG:EventDate}%{SPACE}%{TIME:EventTime}%{SPACE}%{GREEDYDATA:LogMessage}"]
    add_field => [ "timestamp", "%{EventDate} %{EventTime}" ]
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
  }
}

The date filter is needed to override the event's @timestamp field (which otherwise defaults to the time of ingestion) with the timestamp from the log file, so that Kibana's time range lines up with when the events actually occurred.
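For example, the first error line in your sample would give EventDate 2016-05-10 and EventTime 01:18:16, so the helper field becomes "2016-05-10 01:18:16" and the date filter sets @timestamp to that instant. If you do not want to keep the helper field on the indexed document, a variant like this should work (remove_field is a standard common option that is only applied when the date parse succeeds; dropping the field is optional):

date {
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
  # Drop the temporary field once @timestamp has been set from it
  remove_field => [ "timestamp" ]
}

One more thing worth checking: the multiline filter matches its pattern against each line separately, so the leading \n in your "\n^[A-Z]" pattern will never match anything.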

Thanks,
Vinod
