Logstash kafka input plugin doesn't pull data

Hi all,

I have an Ubuntu machine with the following containers and settings, but Logstash doesn't pull any events from Kafka at all. Using the same configuration on a machine with the services installed directly (not in Docker containers) works fine.

Your advice will be much appreciated!

Thanks

CONTAINER ID   IMAGE                             COMMAND                  CREATED          STATUS          PORTS                                            NAMES
3c579260d187   logstash:2.4.0                    "/docker-entrypoin..."   34 minutes ago   Up 14 minutes                                                    logstash
b4529a0f2a41   confluentinc/cp-kafka:3.2.1       "/etc/confluent/do..."   44 minutes ago   Up 14 minutes   0.0.0.0:9092->9092/tcp                           kafka
4bc859aaae5e   confluentinc/cp-zookeeper:3.2.1   "/etc/confluent/do..."   2 hours ago      Up 14 minutes   2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp       zookeeper
8e6cba89c688   elasticsearch:2.4.3               "/docker-entrypoin..."   2 hours ago      Up 14 minutes   0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp   elasticsearch
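
A quick way to rule out basic networking is to check TCP reachability from the logstash container to the broker; a minimal sketch, assuming nc is available in the logstash image (install netcat first if it isn't):

docker exec -it logstash bash -c 'nc -vz kafka 9092'

Note that this only verifies the initial bootstrap connection; the kafka input can still stall if the broker then advertises an unreachable address (see the reply below).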


docker-compose:

version: '2'

services:
  aio-elasticsearch:
    container_name: "elasticsearch"
    image: elasticsearch:2.4.3
    ports:
      - "9200:9200"
      - "9300:9300"
    networks:
      - aionet
    restart: unless-stopped

  aio-logstash:
    container_name: "logstash"
    image: logstash:2.4.0
    volumes: ["./logstash.conf:/opt/logstash/logstash.conf:ro"]
    command: -f /opt/logstash/logstash.conf --debug
    links:
      - aio-zk
      - aio-kafka
    networks:
      - aionet
    restart: unless-stopped

  aio-zk:
    image: "confluentinc/cp-zookeeper:3.2.1"
    container_name: zookeeper
    ports:
      - "2181:2181"
    networks:
      - aionet
    restart: unless-stopped
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
      ZOOKEEPER_SYNC_LIMIT: 2

  aio-kafka:
    image: "confluentinc/cp-kafka:3.2.1"
    container_name: kafka
    ports:
      - "9092:9092"
    networks:
      - aionet
    restart: unless-stopped
    environment:
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://0.0.0.0:9092'
      KAFKA_BROKER_ID: 2

  # >>> ADD MORE DOCKERIZED CORE SERVICES HERE <<<

networks:
  # The network over which all Docker services will communicate
  aionet:
    driver: bridge
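
Since the logstash service runs with --debug, its container log is the first place to look; a sketch using standard Docker commands:

docker-compose up -d
docker logs -f logstash
# --debug makes Logstash log plugin configuration and event flow, so a
# consumer that never receives anything shows up as silence after startup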


root@3c579260d187:/# /opt/logstash/bin/logstash-plugin list --verbose logstash-input-kafka
logstash-input-kafka (5.1.6)


input {
  kafka {
    topics => ['test']
    group_id => 'test_1'
    bootstrap_servers => 'kafka:9092'
  }
}

filter {
  date {
    match => [ "timestamp", "ISO8601" ]
  }
  metrics {
    meter => "events"
    add_tag => "metric"
  }
  mutate {
    rename => { "_id" => "doc_id" }
  }
}

output {
  # only emit events with the 'metric' tag
  if "metric" in [tags] {
    stdout {
      codec => line {
        format => "rate: %{[events][rate_1m]}, count: %{[events][count]}"
      }
    }
  }
  else {
    elasticsearch {
      hosts => ["elasticsearch"]
      index => "events-%{+YYYY-MM-dd}-v%{schema_version}"
      routing => "%{org_pk}"
      document_id => "%{doc_id}"
      flush_size => "100"
      idle_flush_time => "1"
      document_type => "c_levent"
    }
    stdout { codec => rubydebug }
  }
}
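
To see exactly which address the broker hands out to consumers, the cluster metadata can be inspected; a sketch assuming kafkacat is installed in some container attached to the aionet network:

kafkacat -b kafka:9092 -L
# with the compose file above, the metadata lists the broker at 0.0.0.0:9092,
# which is the address the consumer then (unsuccessfully) tries to fetch from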

Try setting this property in docker-compose.yml:

KAFKA_ADVERTISED_HOST_NAME

It should point to the Docker IP, not localhost.
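
For the compose file above, an equivalent fix, as a sketch assuming all Kafka clients run on the aionet network and can resolve the kafka container name, is to advertise a reachable address instead of the 0.0.0.0 wildcard:

  aio-kafka:
    image: "confluentinc/cp-kafka:3.2.1"
    container_name: kafka
    ports:
      - "9092:9092"
    networks:
      - aionet
    restart: unless-stopped
    environment:
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      # advertise an address other containers can actually reach,
      # not the wildcard bind address
      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://kafka:9092'
      KAFKA_BROKER_ID: 2

Clients running on the Docker host itself would then need to resolve kafka (for example via an /etc/hosts entry), or use KAFKA_ADVERTISED_HOST_NAME with the host's IP as suggested above.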
