I use Filebeat 7.8.0 in Docker Compose, on RHEL 7.
When I try to write logs to multiple topics in Kafka, the logs are added to Kafka, but always to a single topic (containerlogs), with no topic selection.
Logs are received only at launch time, and no more are added to Kafka until the container is restarted.
filebeat.yml
name: filebeat.host
logging.level: info

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/lib/docker/containers/*/*-json.log
    fields:
      log_topic: 'containerlogs'

  - type: log
    enabled: true
    paths:
      - /var/log/kafka/*.log
    fields:
      log_topic: 'kafkalogs'

output.kafka:
  hosts: [ "kafka_sasl_ssl_server:9093" ]
  username: 'filebeat'
  password: 'filebeat-password'
  ssl.enabled: true
  ssl.certificate_authorities: ["root-ca.crt"]
  topic: '%{[fields.kafka_topic]}'
  compression: none
  max_retries: -1
  backoff.max: 10s
  required_acks: 1
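One detail I notice in the config above (an observation, not a confirmed diagnosis): the inputs define a custom field named log_topic, but the output references %{[fields.kafka_topic]}, which no input sets. A sketch of the output section with the field names aligned would look like:

```yaml
output.kafka:
  hosts: [ "kafka_sasl_ssl_server:9093" ]
  # The format string must reference the same field name that the
  # inputs define under `fields:` (here `log_topic`, not `kafka_topic`).
  topic: '%{[fields.log_topic]}'
```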
The fields are visible in Elasticsearch (pipeline: filebeat -> kafka -> logstash -> elasticsearch):
"host": {
"name": "npz-09.vm.cmx.ru"
},
"@version": "1",
"fields": {
"log_topic": "containerlogs"
},
"@timestamp": "2020-07-23T09:29:33.928Z"
},
"fields": {
"@timestamp": [
"2020-07-23T09:29:33.928Z"
]
},
If I hardcode the topic, e.g. topic: 'containerlogs', it works correctly.
Please help me use the fields setting correctly to map logs to multiple Kafka topics.