No connection between Kafka and Logstash

Hi,

I had a working configuration:

  • Logstash Standard Input -> Output to Elasticsearch -> Kibana
    I started Logstash and entered some text (like "test1") in the standard input. Logstash wrote it into Elasticsearch and I could see it in Kibana.

Then I changed the configuration:

  • Logstash Kafka Input -> Output to Elasticsearch -> Kibana
    This does not work!

Docker Compose file:

  zookeeper:
    container_name: dev_zookeeper
    image: 31z4/zookeeper:latest
    restart: always
    hostname: zookeeper
    ports:
      - 2181:2181
    volumes:
      - "/data/zookeeper-data/:/usr/share/elasticsearch/data"

  # ************************************************************************** #
  # 1 of 3 brokers #

  kafka0:
    container_name: dev_kafka_node0
    image: myImage
    ports:
      - 9090:9091                                          # External Port / Internal Port  
    environment:
      ZOOKEEPER_IP: 192.168.194.134
      ZOOKEEPER_PORT: 2181
      KAFKA_BROOKER_ID: 0                                  # Unique broker ID
      KAFKA_LISTENERS_IP: localhost                        # Internal IP
      KAFKA_LISTENERS_PORT: 9091                           # Internal Port
      KAFKA_ADVERTISED_LISTENERS_IP: 192.168.194.134       # External IP
      KAFKA_ADVERTISED_LISTENERS_PORT: 9090                # External Port
    depends_on:
      - zookeeper

  # ************************************************************************* #
  # 1 of 3 nodes #

  elasticsearch0:
    container_name: dev_es_node0
    image: docker.elastic.co/elasticsearch/elasticsearch:6.2.4
    ports:
      - 9200:9200                                          # REST
      - 9300:9300                                          # Communication between nodes
    environment:
    - ES_JAVA_OPTS=-Xms512m -Xmx512m
    - cluster.name=devcluster
    - node.name=devnode0
    - node.master=true
    - node.data=true
    - discovery.zen.ping.unicast.hosts=dev_es_node0:9300,dev_es_node1:9301,dev_es_node2:9302
    volumes:

  # ************************************************************************* #

  logstash:
    container_name: dev_logstash
    image: docker.elastic.co/logstash/logstash:6.2.4
    ports:
      - 5000:5000
      - 5001:5001
    environment:
      - XPACK_MONITORING_ENABLED=false
      - ELASTICSEARCH_HOST=192.168.194.134
      - ELASTICSEARCH_PORT=9200
    volumes:
      - ./logstash-pipeline/logstash.conf:/usr/share/logstash/pipeline/logstash.conf

  # ************************************************************************* #

logstash.conf

input {
  kafka {
    bootstrap_servers => "192.168.194.134:9090"
    topics => ["Test1","Test2","Test3"]
    auto_offset_reset => "earliest"
    codec => "json"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "192.168.194.134:9200" ]
  }
}
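
A quick way to rule out syntax problems in this pipeline file is Logstash's --config.test_and_exit flag. A minimal sketch, assuming the container name and paths from the compose file above (--path.data points at a scratch directory so the check does not clash with the running instance):

docker exec dev_logstash /usr/share/logstash/bin/logstash --path.data /tmp/logstash-test -f /usr/share/logstash/pipeline/logstash.conf --config.test_and_exit

If that reports the config as OK, the problem is connectivity rather than the pipeline definition.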

Test:
./kafka-topics.sh --zookeeper 192.168.194.134:2181 \
  --create --topic "Test1" \
  --partitions 1 \
  --replication-factor 3

=> The topic gets created (I can also list the topics), but nothing happens in Logstash.

Thanks a lot for your help!

Did you put JSON-encoded events into the kafka topics, or did you merely create the topics?
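
For example, something along these lines (Kafka bin directory and the advertised broker address assumed from the compose file above) publishes a single JSON event that the json codec can decode:

./kafka-console-producer.sh --broker-list 192.168.194.134:9090 --topic Test1
{"message": "test1"}

The console producer reads lines from stdin and sends each line as one message.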

I tried the following command today, but it had no effect in Logstash, and after some time I got the following error message:

There is no output in the Logstash Docker container:

Independent of the JSON format, Logstash should print something to standard output, shouldn't it?

But on the other hand, I can run kafka-topics.sh without an error (though still without any output in Logstash).
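
(Worth noting: kafka-topics.sh only talks to ZooKeeper on port 2181, so it can succeed even when the broker's advertised listener is unreachable. A console consumer exercises the same bootstrap path that Logstash uses; the address below is assumed from the compose file:)

./kafka-console-consumer.sh --bootstrap-server 192.168.194.134:9090 --topic Test1 --from-beginning

If this consumer also sits there with no output, the broker's listener configuration is the likely culprit.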

That fixed the problem: KAFKA_LISTENERS_IP: 0.0.0.0
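
In other words, the broker must bind its listener to all interfaces inside the container instead of localhost, while still advertising the externally reachable address. Assuming myImage maps these variables onto Kafka's listeners / advertised.listeners settings, the corrected environment block would look like:

      KAFKA_LISTENERS_IP: 0.0.0.0                          # bind on all interfaces inside the container
      KAFKA_LISTENERS_PORT: 9091
      KAFKA_ADVERTISED_LISTENERS_IP: 192.168.194.134       # address clients actually connect to
      KAFKA_ADVERTISED_LISTENERS_PORT: 9090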
