Logstash kafka input not working

Hello,
I have this very simple Logstash config:

input
{
    kafka
    {
        topics => ["analytics"]
        bootstrap_servers => "HSTSCSRP2:9092"
        auto_offset_reset => "earliest"
    }
}

output
{
    elasticsearch
    {
        hosts => "HSTSCSRP1:9200"
        index => "events_2019_05_07"
    }
}

And I start it via bin\logstash -f config\logstash.conf
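
For completeness, two flags I can use to get more detail (both should exist in Logstash 6.x, if I'm not mistaken) — the first validates the config syntax, the second raises the log level so the Kafka consumer logs its polling activity:

bin\logstash -f config\logstash.conf --config.test_and_exit
bin\logstash -f config\logstash.conf --log.level=debug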

No matter what I do, I can't see anything being sent to ES. I tried replacing the output with

output {
  stdout {}
}
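
For more readable test output I could also use the standard rubydebug codec:

output {
  stdout { codec => rubydebug }
}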

And I still don't see any output.
I can confirm the topic exists and that new events are produced every second: if I open a Kafka console consumer with the same topic and host, I see output:
c:\kafka\bin\windows\kafka-console-consumer.bat --zookeeper HSTSCSRP2 --bootstrap-server HSTSCSRP2:9092 --topic analytics
^ this works.
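
One caveat I'm aware of: the console consumer joins a brand-new consumer group each time it starts, while the Logstash kafka input consumes as group "logstash" by default, so the two tests aren't directly comparable. Assuming the same Kafka install also ships kafka-consumer-groups.bat, I can inspect the committed offsets and lag of the logstash group like this:

c:\kafka\bin\windows\kafka-consumer-groups.bat --bootstrap-server HSTSCSRP2:9092 --describe --group logstash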

Logstash version is:

Sending Logstash's logs to C:/logstash/logs which is now configured via log4j2.properties
logstash 6.2.3
jruby 9.1.13.0 (2.3.3) 2017-09-06 8e1c115 Java HotSpot(TM) 64-Bit Server VM 25.211-b12 on 1.8.0_211-b12 +indy +jit [mswin32-x86_64]
java 1.8.0_211 (Oracle Corporation)
jvm Java HotSpot(TM) 64-Bit Server VM / 25.211-b12

There are no errors in the logs. HSTSCSRP2 is the same server, by the way; I'm working on localhost.

Any help would be appreciated.
Thanks!

Adding some info (the first post was limited in length). This is the Logstash command output:

C:\logstash>bin\logstash -f config\logstash.conf
Sending Logstash's logs to C:/logstash/logs which is now configured via log4j2.properties
[2019-05-08T08:27:12,462][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/logstash/modules/fb_apache/configuration"}
[2019-05-08T08:27:12,478][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/logstash/modules/netflow/configuration"}
[2019-05-08T08:27:12,650][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-08T08:27:13,150][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2019-05-08T08:27:13,416][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-05-08T08:27:14,916][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-05-08T08:27:15,275][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://HSTSCSRP1:9200/]}}
[2019-05-08T08:27:15,353][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://HSTSCSRP1:9200/, :path=>"/"}
[2019-05-08T08:27:15,509][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://HSTSCSRP1:9200/"}
[2019-05-08T08:27:15,556][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-05-08T08:27:15,556][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-05-08T08:27:15,572][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-05-08T08:27:15,588][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-05-08T08:27:15,620][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//HSTSCSRP1:9200"]}
[2019-05-08T08:27:15,666][INFO ][logstash.pipeline        ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x5a44e42a sleep>"}
[2019-05-08T08:27:15,744][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2019-05-08T08:27:15,744][INFO ][org.apache.kafka.clients.consumer.ConsumerConfig] ConsumerConfig values:
        auto.commit.interval.ms = 5000
        auto.offset.reset = earliest
        bootstrap.servers = [HSTSCSRP2:9092]
        check.crcs = true
        client.id = logstash-0
        connections.max.idle.ms = 540000
        enable.auto.commit = true
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = logstash
        heartbeat.interval.ms = 3000
        interceptor.classes = null
        internal.leave.group.on.close = true
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 305000
        retry.backoff.ms = 100
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT
        send.buffer.bytes = 131072
        session.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = null
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

[2019-05-08T08:27:15,806][INFO ][org.apache.kafka.common.utils.AppInfoParser] Kafka version : 1.0.0
[2019-05-08T08:27:15,806][INFO ][org.apache.kafka.common.utils.AppInfoParser] Kafka commitId : aaa7af6d4a11b29d
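
One thing that stands out to me in this dump: group.id = logstash together with auto.offset.reset = earliest. As far as I understand the Kafka consumer, auto.offset.reset only kicks in when the group has no committed offsets; if the "logstash" group ever committed offsets at the end of the topic, the input would sit at the tail and appear to receive nothing. A sketch of how I could rule that out by switching to a fresh group (the group name here is just an example):

input
{
    kafka
    {
        topics => ["analytics"]
        bootstrap_servers => "HSTSCSRP2:9092"
        auto_offset_reset => "earliest"
        # fresh group has no committed offsets yet, so "earliest" applies
        group_id => "logstash-debug"
    }
}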

Anyone? Any tips on how I can debug this?
Sorry for the bump; this got buried quickly.

Thanks.

Bump again. I noticed the part of the log that says the pipeline thread is in "sleep" (#<Thread:0x5a44e42a sleep>). I have another machine with the exact same config and versions where this works, and there it says "run".

Maybe that's related?
I really don't understand what's going on; it's as if Logstash doesn't work at all, yet with the same config and installation on a different machine everything works. There are no errors in the logs.
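
Since the startup log shows the Logstash API endpoint on port 9600, one thing I can check is whether the pipeline receives any events at all, via the standard node stats API (run locally on the Logstash box):

curl -XGET "http://localhost:9600/_node/stats/pipelines?pretty"

If events.in for the main pipeline stays at 0, the Kafka input never hands anything to the pipeline; if it grows, the problem is downstream of the input.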

Any help?
