Hi, I am in the process of implementing Filebeat and Elasticsearch (not using Logstash) to index my logs and create some visualizations in Kibana. I have set up the system module in Filebeat, which works fine and sends the logs to the filebeat* index, but I have another log file that I would like to index into a separate index, using a hand-made template and an ingest pipeline. I have the following filebeat.yml:
```yaml
filebeat.inputs:
- type: filestream
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/user.log

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  #reload.period: 10s

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  # Protocol - either `http` (default) or `https`.
  protocol: "http"
  pipelines:
    - pipeline: "test"
      when.contains:
        # My thinking here is that each line in user.log contains this string,
        # so I should be able to use it to separate these events from the logs
        # defined in system.yml. I'm just not sure the field name is correct,
        # since this is coming straight from the log file.
        message: "mysql-server_auditing"
  indices:
    - index: "test"
      when.contains:
        message: "mysql-server_auditing"

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded

setup.template.name: "test"
setup.template.pattern: "test"
setup.ilm.enabled: false
```
When I run Filebeat, everything gets sent to the filebeat* index and nothing to the test index (I have created a template and an ingest pipeline for it); the test index doesn't even get created.
How can I send the contents of user.log to the test index while the logs defined in modules.d/system.yml still get sent to the filebeat* index?
I'm using the latest versions of everything, 7.14.0.
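For completeness, here is the other approach I was considering in case the output-level conditions are not the right tool: Filebeat inputs also accept per-input `index` and `pipeline` settings, so the routing could in principle be attached to the filestream input itself. This is only a sketch of that idea, reusing my existing "test" template/pipeline names; I have not verified it:

```yaml
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - /var/log/user.log
  # Assumption: these per-input overrides route only the events from this
  # input, leaving the system module's events on the default filebeat* index.
  index: "test"        # target index for events from this input
  pipeline: "test"     # ingest pipeline applied by Elasticsearch
```

With this, the `when.contains` checks on `message` would not be needed for routing, though the template would still have to cover the "test" index name.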