I'm very new to Elasticsearch, so I may be approaching this problem incorrectly.
I've got a couple of logs with different formats. It's easy enough to create an ingest pipeline with a grok pattern for each, but I'd also like to send each log type to a different index.
I assume this is a common pattern. How do other folks handle this?
Ideally I'd like to do this with just Filebeat and Elasticsearch.
Never mind, I found the answer. I had to use inputs instead of the module in my filebeat.yml. The relevant excerpts from the final config:
filebeat.inputs:
- type: log
  paths:
    - "C:/tester.log"
  fields:
    type: "applog"
  # VERY IMPORTANT: don't set fields_under_root. Obvious now, not at the time...
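For context on that comment: with fields_under_root left unset (it defaults to false), custom fields declared under fields: stay nested under a top-level fields object in each event, which is exactly what the when.equals conditions on fields.type match against. A simplified sketch of the resulting event (values here are illustrative, not taken from a real log):

```json
{
  "message": "2018-08-01 12:00:00 INFO something happened",
  "fields": { "type": "applog" },
  "beat": { "version": "6.3.2" }
}
```

If fields_under_root were set to true, the field would land at the top level as type instead of fields.type, and the conditions below would silently stop matching.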
setup.template:
  name: "%{[fields.type]:filebeat}-%{[beat.version]}"
  pattern: "%{[fields.type]:filebeat}-%{[beat.version]}-*"
[...]
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["http://server:9200"]
  index: "filebeat-%{[beat.version]}-%{+yyyy.MM.dd}"
  indices:
    - index: "applog-%{[beat.version]}-%{+yyyy.MM.dd}"
      when.equals:
        fields.type: "applog"
  pipelines:
    - pipeline: "filebeat-6.3.2-applog-log-default"
      when.equals:
        fields.type: "applog"
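For completeness, the ingest pipeline named under pipelines: has to exist in Elasticsearch before events arrive, or indexing will fail. A minimal sketch of what such a grok pipeline might look like, created via the Kibana Dev Tools console (the grok pattern and extracted field names here are made-up examples, not my actual applog format):

```
PUT _ingest/pipeline/filebeat-6.3.2-applog-log-default
{
  "description": "Parse applog lines",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}"]
      }
    }
  ]
}
```

With this in place, events tagged fields.type: "applog" get routed to the applog-* index and run through this pipeline, while everything else falls through to the default filebeat-* index.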