We are running Elastic Stack 8.15, where the clients' Filebeat output goes to Kafka and from there to Elasticsearch. Filebeat sends different logs to separate Kafka topics. We are using the following kind of configuration in "filebeat.yml" (only the meaningful lines).
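For illustration, the setup has roughly this shape; the paths, broker address, and topic names below are placeholders rather than our exact values:

```yaml
filebeat.inputs:
  - type: filestream
    id: app1-logs                     # example input
    paths:
      - /var/log/app1/*.log           # placeholder path
    fields:
      kafka_topic: app1-logs          # routing field used to pick the topic

  - type: filestream
    id: app2-logs
    paths:
      - /var/log/app2/*.log
    fields:
      kafka_topic: app2-logs

output.kafka:
  hosts: ["kafka-broker-1:9092"]      # placeholder broker
  topic: '%{[fields.kafka_topic]}'    # dynamic topic, resolved per event
```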
We'd like to start using Elastic Agent with Fleet, which is already working, but we have one issue: how do we configure the Kafka output and dynamic topics? As far as I know, it should be done by adding a Custom Logs integration to the agent policy and then adding some code to the "Processors" field. Has someone figured out how this needs to be done?
The main difference is that you need one Custom Logs integration per Filebeat input: in your example you have two inputs, so you would add one Custom Logs integration for each of them, and set the routing field in each integration's "Processors" field, as sketched below.
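A rough sketch, assuming the Fleet Kafka output still references `fields.kafka_topic`: each Custom Logs integration can add that field with a standard Beats `add_fields` processor (the topic values here are only examples):

```yaml
# "Processors" field of the Custom Logs integration for the first input
- add_fields:
    target: fields            # keeps the value under fields.kafka_topic
    fields:
      kafka_topic: app1-logs  # example topic name

# The second integration would do the same with its own value,
# e.g. kafka_topic: app2-logs
```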
PS C:\Program Files\Elastic\Agent> .\elastic-agent.exe status
┌─ fleet
│  └─ status: (STARTING)
└─ elastic-agent
   ├─ status: (DEGRADED) 1 or more components/units in a failed state
   └─ log-b2d12f01-f359-45dd-be6c-e67c43b81cc9
      ├─ status: (HEALTHY) Healthy: communicating with pid '4504'
      ├─ log-b2d12f01-f359-45dd-be6c-e67c43b81cc9
      │  └─ status: (FAILED) could not start output: failed to reload output: topic '%{[fields.kafka_topic]}' is invalid, it must match '[a-zA-Z0-9._-]' accessing 'kafka'
      └─ log-b2d12f01-f359-45dd-be6c-e67c43b81cc9-logfile-logs-6f3c2805-c5dd-45dc-944a-d8e101173873
         └─ status: (STARTING) Starting
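The failure message suggests that the Fleet-managed Kafka output validates the topic string as a literal name (characters `[a-zA-Z0-9._-]`) before any event-field expansion, so a `%{[...]}` placeholder is rejected outright. Roughly, this is the shape of the output section the agent is trying to load (the broker address is a placeholder):

```yaml
outputs:
  kafka:
    type: kafka
    hosts: ["kafka-broker-1:9092"]
    # Rejected: the placeholder itself fails the [a-zA-Z0-9._-] check
    topic: '%{[fields.kafka_topic]}'
    # A plain, static topic name would pass validation, e.g.:
    # topic: filebeat-logs
```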