Does anyone see any issue with harvesting the same file from different Filebeat prospectors, each with its own custom fields and its own set of include/exclude regexes?
My use case: I want Filebeat to parse a single file against several sets of include and exclude patterns, tag each match with different field values, and then use `when` conditions in the Filebeat output config to send the parsed data to different indexes in Elasticsearch. Filebeat seems to do this pretty neatly, but does anyone see an issue with this approach? My config is below.
Version - 5.1.1-1
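For context on how the routing works: with `fields_under_root: false`, each prospector's custom fields end up nested under `fields`, and `document_type` becomes the event's top-level `type` field, which is what the output's `when` conditions test against. A trimmed sketch of an event the first prospector would ship (values illustrative, other standard fields like `beat.*` and `offset` omitted):

```json
{
  "@timestamp": "2017-01-10T12:00:00.000Z",
  "type": "alert",
  "source": "/var/log/test.log",
  "message": "ERROR something went wrong",
  "fields": {
    "sev": "MAJOR",
    "label": "ERROR"
  }
}
```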
```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/test.log
      input_type: log
      document_type: alert
      ignore_older: 24h
      exclude_lines: ['^DBG']
      include_lines: ['^ERROR']
      scan_frequency: 10s
      backoff: 1s
      max_backoff: 10s
      fields:
        sev: "MAJOR"
        label: "ERROR"
      backoff_factor: 2
      force_close_files: false
      fields_under_root: false
      close_older: 2h
    -
      paths:
        - /var/log/test.log
      input_type: log
      document_type: alert
      ignore_older: 24h
      exclude_lines: ['java\.lang\.IllegalArgumentException: Document base']
      include_lines: ['ERROR LogMananger\.repositorySelector was null']
      scan_frequency: 10s
      backoff: 1s
      max_backoff: 10s
      fields:
        sev: "MINOR"
        label: "repo-issue"
      backoff_factor: 2
      force_close_files: false
      fields_under_root: false
      close_older: 2h
    -
      paths:
        - /var/log/test.log
      input_type: log
      document_type: alert
      ignore_older: 24h
      exclude_lines: ['java\.lang\.IllegalArgumentException: Document base']
      include_lines: ['SEVERE: Servlet\.service', 'Stopping service Catalina']
      scan_frequency: 10s
      backoff: 1s
      max_backoff: 10s
      fields:
        sev: "MINOR"
        label: "Servlet_Failure"
      backoff_factor: 2
      force_close_files: false
      fields_under_root: false
      close_older: 2h

output:
  elasticsearch:
    hosts: ['https://xxxxxxxxxx:443']
    index: filebeat-%{+yyyy.MM.dd}
    indices:
      - index: "alert-%{+yyyy.MM.dd}"
        when.contains:
          type: "alert"

name: grafana-mon-GrafanaApp-15FIOD3L34PR
```
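Since the end goal is separate indexes per severity/label, I'd eventually extend the `indices` list with one conditional entry per target, keyed off the nested custom fields. A minimal sketch of what I have in mind (the `alert-major-*` / `alert-minor-*` index names are just illustrative):

```yaml
output:
  elasticsearch:
    hosts: ['https://xxxxxxxxxx:443']
    index: filebeat-%{+yyyy.MM.dd}            # fallback when no condition matches
    indices:
      - index: "alert-major-%{+yyyy.MM.dd}"   # illustrative index name
        when.equals:
          fields.sev: "MAJOR"
      - index: "alert-minor-%{+yyyy.MM.dd}"   # illustrative index name
        when.equals:
          fields.sev: "MINOR"
```

As I understand it, the first matching entry in `indices` wins, and events matching no condition fall back to the default `index` setting.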