HCP 1.9.1 Installation and demo, No results found in Kibana

Since this issue seems related to the Elasticsearch index, I'm posting it here.

I have installed HCP 1.9.1 and it is working.
After installation, I tried the learning and training tutorial described at:
https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.9.1/runbook-overview/index.html

I get to the point where I cannot see the Squid logs in the Elasticsearch index.

I'm stuck at this step:
https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.9.1/add-new-telemetry/content/verify_that_the_events_are_indexed.html

The default index was created; my workaround was to use this: https://pastebin.com/9TPV5XQN

I also used this: https://pastebin.com/9WSmhNAn

Running `tcpdump -A -i ens192 src <nifi-ipaddress>` on my single Elasticsearch host shows this:

```
12:16:16.224880 IP HCP10-DataSource1.33810 > HCP9-MetronSearch1.ircu-3: Flags [P.], seq 1490977535:1490977718, ack 1135751061, win 31088, options [nop,nop,TS val 10936402 ecr 10787491], length 183
E....[@.@.'.....X...C./...yp.......R...............y.
nifi-squid....u0......squid.......................v..6-.........h1557378975.223 232 127.0.0.1 TCP_MISS/302 280 GET http://www.atmape.ru/ - HIER_DIRECT/36.86.63.182 -
12:16:21.226164 ARP, Reply HCP10-DataSource1 is-at 00:50:56:99:6b:51 (oui Unknown), length 46
.........PV.kQ...PV..4....................
```
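For reference, the Squid entry visible in that capture is a standard access.log line. A minimal Python sketch (my own regex, not Metron's actual Grok pattern) that splits it into the kind of fields Metron would index, which can help confirm the data on the wire is well formed:

```python
import re

# Squid native access.log fields: timestamp, elapsed ms, client IP,
# action/status, bytes, method, URL, user, hierarchy/peer, content type.
SQUID_RE = re.compile(
    r"(?P<timestamp>\d+\.\d+)\s+(?P<elapsed>\d+)\s+(?P<ip_src_addr>\S+)\s+"
    r"(?P<action>\S+)/(?P<code>\d+)\s+(?P<bytes>\d+)\s+(?P<method>\S+)\s+"
    r"(?P<url>\S+)\s+(?P<user>\S+)\s+(?P<hierarchy>\S+)\s+(?P<content_type>\S+)"
)

def parse_squid(line):
    """Return a dict of named fields, or None if the line does not match."""
    m = SQUID_RE.match(line)
    return m.groupdict() if m else None

# The line seen in the tcpdump payload above:
line = ("1557378975.223 232 127.0.0.1 TCP_MISS/302 280 GET "
        "http://www.atmape.ru/ - HIER_DIRECT/36.86.63.182 -")
event = parse_squid(line)
print(event["code"], event["method"], event["url"])
```

Since the line parses cleanly, the problem is more likely downstream (indexing topology or index template) than in the Squid data itself.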

But no results are found in Kibana.

Any help would be appreciated, thank you!

It's not really clear where Logstash comes into things here. Are you able to clarify?

Also, that looks like a really old version of the stack (v5); can you upgrade?

Hello warkolm, I edited the first post and removed the Logstash mention, as it was not appropriate and may be confusing.

I used the Hortonworks HCP management pack to install Elasticsearch, so it automatically uses the bundled version:

http://public-repo-1.hortonworks.com/HCP/centos7/1.x/updates/1.9.1.0/tars/metron/elasticsearch_mpack-1.9.1.0-6.tar.gz

I'm not sure upgrading will make this work, but I will consider it.

The resolution was to fix this recurring Kafka error:

ERROR [KafkaApi-1001] Number of alive brokers '2' does not meet the required replication factor '3' for the offsets topic (configured via 'offsets.topic.replication.factor'). This error can be ignored if the cluster is starting up and not all brokers are up yet. (kafka.server.KafkaApis)

I have this HCP setup as a demo, with only 2 Kafka brokers.

Simply go to Ambari > Kafka > Configs and change offsets.topic.replication.factor=2,

then restart Kafka and Metron.
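The Ambari change above corresponds to this broker setting (shown here as it would appear in Kafka's server.properties; in HCP it is managed through Ambari's Kafka config screen rather than edited by hand):

```properties
# Replication factor for the internal __consumer_offsets topic.
# Must not exceed the number of live brokers (2 in this demo cluster).
offsets.topic.replication.factor=2
```

Note that, as far as I know, this setting only takes effect when the internal offsets topic is first created; if it already exists with a higher factor, it may need to be recreated.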
