I'm curious if anyone else has faced this or if there's a recommended approach.
Here is the setup:
- Docker Swarm node runs several docker containers
- Logstash parses the logs from all docker containers on the node
- Logstash pushes to a kafka topic based on the docker service name
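For context, the relevant part of the pipeline looks roughly like this. This is a hypothetical sketch, not my actual config: the input type, broker address, and the `[docker][service]` field name are assumptions, but it shows how the topic is derived from the service name via the kafka output's `topic_id` option:

```
# Hypothetical Logstash pipeline sketch -- field names and addresses are illustrative
input {
  gelf { port => 12201 }          # e.g. containers logging via the gelf driver
}
output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "%{[docker][service]}"   # one topic per docker service name
    codec => "json"
  }
}
```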
Here is the Issue:
Due to a constraint in the architecture, Kafka topics aren't auto-created. So if a docker service is deployed to the swarm node without its Kafka topic pre-created, logs from all the other docker services stop being pushed as well. Logstash throws:
[2019-01-15T00:00:06,345][WARN ][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-2] Error while fetching metadata with correlation id 2371908 : {topic1234=UNKNOWN_TOPIC_OR_PARTITION}
My initial thought is to validate that the Kafka topic exists before every push, but that seems like a lot of extra processing on the output and would slow it down.
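To make the idea concrete, the validation wouldn't have to hit the broker on every event; it could check against a cached topic list that is refreshed on a TTL. This is a minimal sketch of that logic only: `fetch_topics` stands in for a real metadata/AdminClient call, and the names are made up:

```python
# Sketch: per-event topic validation against a cached topic list,
# refreshed at most once per TTL. fetch_topics is a stand-in for a
# real broker metadata call (e.g. via an AdminClient).
import time


class TopicValidator:
    def __init__(self, fetch_topics, ttl_seconds=60):
        self._fetch = fetch_topics      # callable returning an iterable of topic names
        self._ttl = ttl_seconds
        self._cache = set()
        self._last_refresh = float("-inf")

    def exists(self, topic):
        now = time.monotonic()
        if now - self._last_refresh > self._ttl:
            self._cache = set(self._fetch())   # refresh cache from broker metadata
            self._last_refresh = now
        return topic in self._cache


# Usage with a stubbed metadata source:
validator = TopicValidator(lambda: {"topic_a", "topic_b"}, ttl_seconds=60)
print(validator.exists("topic_a"))    # True
print(validator.exists("topic1234"))  # False -- would be dropped or rerouted
```

Events for unknown topics could then be dropped or rerouted to a catch-all topic instead of blocking the whole output, but that's exactly the part I'm unsure about.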
Any recommendations?