Unable to read WebLogic logs

Hi All,

I am new to Filebeat and am trying to set up the ELK stack. Right now I have configured Filebeat to read WebLogic logs, but it is not picking up anything from them.

Below are the configuration details. Please help me out.

filebeat.config.modules: 
  path: D:\ProgramFiles\filebeat\modules.d\*.yml
  reload.enabled: true
  reload.period: 60s
filebeat.prospectors: 
  - 
    enabled: true
    exclude_lines: 
      - ^DEBUG
      - ^WARNING
    include_lines: 
      - ^ERROR
      - ^FATAL
      - ^INFO
      - ^NOTICE
      - ^SEVERE
    multiline.match: after
    multiline.negate: false
    multiline.pattern: ^\####</%{MONTH}/%{MONTHDAY}/%{YEAR} %{TIME}>
    paths: 
      - D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\*.log
    type: log
logging.level: info
logging.metrics.enabled: true
logging.metrics.period: 30s
logging.selectors: 
  - "*"
output.elasticsearch: 
  enabled: true
  hosts: 127.0.0.1:9200
  template.name: filebeat
  template.overwrite: false
  template.path: filebeat.template.json
path.config: "${path.home}"
path.data: "${path.home}/data"
path.home: D:\ProgramFiles\filebeat
path.logs: "${path.home}/logs"
setup.kibana: 
  host: 127.0.0.1:5601
setup.template.settings: 
  index.number_of_shards: 3

Also, please find below a line from the log file that I am trying to read. Am I missing something here? Please help.

####<Jan 11, 2018 6:48:36 AM GMT> <Notice> <WebLogicServer> <HYD-DT-23> <AdminServer> <main> <<WLS Kernel>> <> <> <1515653316178> <BEA-000365> <Server state changed to ADMIN>
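One thing worth noting about the config above: Filebeat's multiline.pattern takes a plain regular expression and does not expand Grok tokens such as %{MONTH}, so the posted pattern will never match anything. A sketch of a plain-regex alternative, assuming every entry begins with a ####< header like the line above:

```yaml
# Sketch only: treat each "####<...>" header as the start of a new event.
# negate: true with match: after means lines that do NOT match the pattern
# are appended to the preceding line that did match, which stitches
# stack traces and continuation lines onto their ####< entry.
multiline.pattern: '^####<'
multiline.negate: true
multiline.match: after
```

Note that the original config has negate: false, which with this log format would append each new ####< header to the previous event instead of starting a fresh one.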

Could you please share the Filebeat logs?

Here is the content of the Filebeat log file:

2018-01-11T14:59:19+05:30 INFO Home path: [D:/ProgramFiles/filebeat] Config path: [D:/ProgramFiles/filebeat] Data path: [D:/ProgramFiles/filebeat/data] Logs path: [D:/ProgramFiles/filebeat/logs]
2018-01-11T14:59:19+05:30 INFO Metrics logging every 30s
2018-01-11T14:59:19+05:30 INFO Beat UUID: e2f77828-f949-4dd6-895c-1ade32f11f02

2018-01-11T14:59:45+05:30 INFO Home path: [D:/ProgramFiles/filebeat] Config path: [D:/ProgramFiles/filebeat] Data path: [D:/ProgramFiles/filebeat/data] Logs path: [D:/ProgramFiles/filebeat/logs]
2018-01-11T14:59:45+05:30 INFO Metrics logging every 30s
2018-01-11T14:59:45+05:30 INFO Beat UUID: e2f77828-f949-4dd6-895c-1ade32f11f02
2018-01-11T14:59:45+05:30 INFO Setup Beat: filebeat; Version: 6.1.1
2018-01-11T14:59:45+05:30 INFO Elasticsearch url: http://localhost:9200
2018-01-11T14:59:45+05:30 INFO Beat name: HYD-DT-23
2018-01-11T14:59:45+05:30 INFO Elasticsearch url: http://localhost:9200
2018-01-11T14:59:46+05:30 INFO Connected to Elasticsearch version 6.1.1
2018-01-11T14:59:46+05:30 INFO Template already exists and will not be overwritten.

To me it seems like the path to the log files is not correct. FB does not notify you if it cannot find files, so it is good practice to always double-check paths.

Could you start FB using filebeat -e -d "*" and paste the whole output?


Hi KVCH,

Thank you very much for your time. Here is the output of the command:

2018/01/12 13:01:44.982067 beat.go:436: INFO Home path: [D:\ProgramFiles\filebeat] Config path: [D:\ProgramFiles\filebeat] Data path: [D:\ProgramFiles\filebeat\data] Logs path: [D:\ProgramFiles\filebeat\logs]
2018/01/12 13:01:44.983067 beat.go:463: DBG [beat] Beat metadata path: D:\ProgramFiles\filebeat\data\meta.json
2018/01/12 13:01:44.983067 metrics.go:23: INFO Metrics logging every 30s
2018/01/12 13:01:44.984067 beat.go:443: INFO Beat UUID: e2f77828-f949-4dd6-895c-1ade32f11f02
2018/01/12 13:01:44.984067 beat.go:203: INFO Setup Beat: filebeat; Version: 6.1.1
2018/01/12 13:01:44.984067 beat.go:215: DBG [beat] Initializing output plugins
2018/01/12 13:01:44.984067 processor.go:49: DBG [processors] Processors:
2018/01/12 13:01:44.986067 logger.go:18: DBG [publish] start pipeline event consumer
2018/01/12 13:01:44.987067 module.go:76: INFO Beat name: HYD-DT-23
2018/01/12 13:01:44.989067 beat.go:276: INFO filebeat start running.
2018/01/12 13:01:44.989067 registrar.go:88: INFO Registry file set to: D:\ProgramFiles\filebeat\data\registry
2018/01/12 13:01:44.989067 service_windows.go:51: DBG [service] Windows is interactive: true
2018/01/12 13:01:44.990067 registrar.go:108: INFO Loading registrar data from D:\ProgramFiles\filebeat\data\registry
2018/01/12 13:01:44.990067 registrar.go:119: INFO States Loaded from registrar: 3
2018/01/12 13:01:44.990067 filebeat.go:261: WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018/01/12 13:01:44.991067 crawler.go:48: INFO Loading Prospectors: 4
2018/01/12 13:01:44.991067 processor.go:49: DBG [processors] Processors:
2018/01/12 13:01:44.991067 config.go:178: DBG [prospector] recursive glob enabled
2018/01/12 13:01:44.991067 prospector.go:120: DBG [prospector] exclude_files: []. Number of stats: 3
2018/01/12 13:01:44.992067 state.go:81: DBG [prospector] New state added for D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\access.log
2018/01/12 13:01:44.992067 prospector.go:141: DBG [prospector] Prospector with previous states loaded: 1
2018/01/12 13:01:44.992067 prospector.go:111: DBG [prospector] File Configs: [D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\access.log]
2018/01/12 13:01:44.992067 prospector.go:87: INFO Starting prospector of type: log; ID: 8920572841936861400
2018/01/12 13:01:44.992067 processor.go:49: DBG [processors] Processors:
2018/01/12 13:01:44.993067 config.go:178: DBG [prospector] recursive glob enabled
2018/01/12 13:01:44.993067 prospector.go:120: DBG [prospector] exclude_files: []. Number of stats: 3
2018/01/12 13:01:44.993067 prospector.go:141: DBG [prospector] Prospector with previous states loaded: 0
2018/01/12 13:01:44.993067 prospector.go:111: DBG [prospector] File Configs: [D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\diagnostic.log]
2018/01/12 13:01:44.993067 prospector.go:87: INFO Starting prospector of type: log; ID: 1892114427901672422
2018/01/12 13:01:44.993067 processor.go:49: DBG [processors] Processors:
2018/01/12 13:01:44.994067 config.go:178: DBG [prospector] recursive glob enabled
2018/01/12 13:01:44.994067 prospector.go:120: DBG [prospector] exclude_files: []. Number of stats: 3
2018/01/12 13:01:44.991067 registrar.go:150: INFO Starting Registrar
2018/01/12 13:01:44.992067 prospector.go:147: DBG [prospector] Start next scan
2018/01/12 13:01:44.994067 prospector.go:147: DBG [prospector] Start next scan
2018/01/12 13:01:44.995067 prospector.go:168: DBG [prospector] Prospector states cleaned up. Before: 0, After: 0
2018/01/12 13:01:44.995067 prospector.go:141: DBG [prospector] Prospector with previous states loaded: 0
2018/01/12 13:01:44.995067 prospector.go:111: DBG [prospector] File Configs: [D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\server.log]
2018/01/12 13:01:44.995067 prospector.go:87: INFO Starting prospector of type: log; ID: 16454091367493209274
2018/01/12 13:01:44.996067 processor.go:49: DBG [processors] Processors:
2018/01/12 13:01:44.996067 config.go:178: DBG [prospector] recursive glob enabled
2018/01/12 13:01:44.996067 prospector.go:120: DBG [prospector] exclude_files: []. Number of stats: 3
2018/01/12 13:01:44.996067 prospector.go:141: DBG [prospector] Prospector with previous states loaded: 0
2018/01/12 13:01:44.996067 prospector.go:111: DBG [prospector] File Configs: [D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\stdout.txt]
2018/01/12 13:01:44.997067 prospector.go:87: INFO Starting prospector of type: log; ID: 8637686444003508277
2018/01/12 13:01:44.997067 crawler.go:82: INFO Loading and starting Prospectors completed. Enabled prospectors: 4
2018/01/12 13:01:44.996067 prospector.go:147: DBG [prospector] Start next scan
2018/01/12 13:01:44.997067 prospector.go:168: DBG [prospector] Prospector states cleaned up. Before: 0, After: 0
2018/01/12 13:01:44.997067 prospector.go:147: DBG [prospector] Start next scan
2018/01/12 13:01:44.997067 prospector.go:168: DBG [prospector] Prospector states cleaned up. Before: 0, After: 0
2018/01/12 13:01:44.995067 registrar.go:200: DBG [registrar] Processing 1 events
2018/01/12 13:01:44.998067 registrar.go:193: DBG [registrar] Registrar states cleaned up. Before: 3, After: 3
2018/01/12 13:01:44.998067 registrar.go:228: DBG [registrar] Write registry file: D:\ProgramFiles\filebeat\data\registry
2018/01/12 13:01:44.995067 prospector.go:361: DBG [prospector] Check file for harvesting: D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\access.log
2018/01/12 13:01:45.000067 prospector.go:447: DBG [prospector] Update existing file for harvesting: D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\access.log, offset: 0
2018/01/12 13:01:45.002067 prospector.go:501: DBG [prospector] File didn't change: D:\Bea\user_projects\domains\domain912\servers\AdminServer\logs\access.log
2018/01/12 13:01:45.005067 prospector.go:168: DBG [prospector] Prospector states cleaned up. Before: 1, After: 1
2018/01/12 13:01:45.053067 registrar.go:253: DBG [registrar] Registry file updated. 3 states written.
2018/01/12 13:01:54.995067 prospector.go:124: DBG [prospector] Run prospector
2018/01/12 13:01:54.995067 prospector.go:147: DBG [prospector] Start next scan
2018/01/12 13:01:54.996067 prospector.go:168: DBG [prospector] Prospector states cleaned up. Before: 0, After: 0
2018/01/12 13:01:54.997067 prospector.go:124: DBG [prospector] Run prospector
2018/01/12 13:01:54.997067 prospector.go:147: DBG [prospector]

It seems to me that FB can find your log files. However, all of them had been read before, and no new log lines have arrived since. If you want FB to reread the logs, you can delete the registry file, where the states are saved. The registry file is: D:\ProgramFiles\filebeat\data\registry.

Also, I have seen this warning in your logs:
WARN Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
Could you check if your ES settings are correct?


Hi,

Apologies for the delay. I am using Logstash to redirect the output instead of Elasticsearch. I also realized that the log file was supposed to be named AdminServer.log, but I had mistakenly given it as server.log.
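For reference, the Logstash side of that is configured with the output.logstash section; a minimal sketch (the host and port are placeholders for wherever the Logstash beats input is listening):

```yaml
# Sketch: ship events to Logstash instead of Elasticsearch.
# Disable or remove output.elasticsearch when enabling this,
# since Filebeat allows only one active output at a time.
output.logstash:
  enabled: true
  hosts: ["127.0.0.1:5044"]   # placeholder: Logstash beats input host:port
```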

Here is the console output after I changed the name:

2018/01/16 09:16:17.408238 crawler.go:82: INFO Loading and starting Prospectors completed. Enabled prospectors: 4
2018/01/16 09:16:17.406238 processor.go:275: DBG [publish] Publish event: {
  "@timestamp": "2018-01-16T09:16:17.406Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.1.1"
  },
  "tags": [
    "weblogic-server-log"
  ],
  "prospector": {
    "type": "log"
  },
  "beat": {
    "name": "HYD-DT-23",
    "hostname": "HYD-DT-23",
    "version": "6.1.1"
  },
  "source": "D:\\Bea\\user_projects\\domains\\domain912\\servers\\AdminServer\\logs\\AdminServer.log",
  "offset": 691,
  "message": "####\u003cJan 16, 2018 9:02:08 AM GMT\u003e \u003cNotice\u003e \u003cWebLogicServer\u003e \u003c\u003e \u003c\u003e \u003cmain\u003e \u003c\u003e \u003c\u003e \u003c\u003e \u003c1516093328773\u003e \u003cBEA-000365\u003e \u003cServer state changed to STARTING\u003e "
}
2018/01/16 09:16:17.452243 processor.go:275: DBG [publish] Publish event: {
  "@timestamp": "2018-01-16T09:16:17.451Z",

Thank you very much. I will update you on whether the setup has worked completely by the end of today (my time).

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.