ELK not working

Hi guys,
I have Tomcat installed on this server, and I also installed Filebeat (7.0.0 according to the log below) on the same Tomcat server, but I'm not able to make a connection to Kibana. Elasticsearch, Kibana, and Logstash are all configured on the same server, 192.168.185.143.

Filebeat log (I have specified the Kibana URL/IP in filebeat.yml):

2019-05-12T09:10:41.301-0700 INFO instance/beat.go:280 Setup Beat: filebeat; Version: 7.0.0
2019-05-12T09:10:41.302-0700 INFO [publisher] pipeline/module.go:97 Beat name: tomcat-beat
2019-05-12T09:10:41.303-0700 INFO kibana/client.go:118 Kibana url: http://localhost:5601
2019-05-12T09:10:41.306-0700 ERROR instance/beat.go:802 Exiting: error connecting to Kibana: fail to get the Kibana version: HTTP GET request to http://localhost:5601/api/status fails: fail to execute the HTTP GET request: Get http://localhost:5601/api/status: dial tcp 127.0.0.1:5601: connect: connection refused. Response: .
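That error means Filebeat is trying to reach Kibana on localhost, not on the server IP. As a quick sanity check (assuming Kibana really listens on 192.168.185.143:5601, as in the filebeat.yml further down), you can query the same status endpoint Filebeat uses:

    # should return a JSON status document if Kibana is reachable
    curl http://192.168.185.143:5601/api/status

If that curl works but Filebeat still logs localhost:5601, the setup.kibana host in filebeat.yml is not being picked up.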

Logstash log:

[2018-12-10T02:23:01,342][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.05.12", :_type=>"_doc", :_routing=>nil}, #LogStash::Event:0x2182d719], :response=>{"create"=>{"_index"=>"filebeat-2019.05.12", "_type"=>"_doc", "id"=>"AWeW_206fmSo3qcbXu57", "status"=>400, "error"=>{"type"=>"invalid_type_name_exception", "reason"=>"Document mapping type name can't start with ''"}}}}
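For reference, this invalid_type_name_exception shows up because newer Beats/Logstash index events with the document type _doc, and Elasticsearch 2.x does not accept type names that start with an underscore. A minimal sketch of a workaround in the Logstash elasticsearch output (the pipeline file name /etc/logstash/conf.d/beats.conf is just an assumption here):

    output {
      elasticsearch {
        hosts => ["192.168.185.143:9200"]
        index => "filebeat-%{+YYYY.MM.dd}"
        # Elasticsearch 2.x rejects type names beginning with "_",
        # so override the default "_doc" that the newer stack sends
        document_type => "doc"
      }
    }

The cleaner fix is to run matching versions of Elasticsearch, Logstash, Kibana and Filebeat, as suggested in the reply below.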

Elasticsearch log:

[2018-12-09 16:17:02,589][INFO ][node ] [Hardcore] version[2.4.6], pid[39547], build[5376dca/2017-07-18T12:17:44Z]
[2018-12-09 16:17:02,589][INFO ][node ] [Hardcore] initializing ...
[2018-12-09 16:17:03,120][INFO ][plugins ] [Hardcore] modules [reindex, lang-expression, lang-groovy], plugins , sites
[2018-12-09 16:17:03,188][INFO ][env ] [Hardcore] using [1] data paths, mounts [[/ (rootfs)]], net usable_space [10.5gb], net total_space [12.9gb], spins? [unknown], types [rootfs]
[2018-12-09 16:17:03,188][INFO ][env ] [Hardcore] heap size [1015.6mb], compressed ordinary object pointers [true]
[2018-12-09 16:17:05,060][INFO ][node ] [Hardcore] initialized
[2018-12-09 16:17:05,060][INFO ][node ] [Hardcore] starting ...
[2018-12-09 16:17:05,105][INFO ][transport ] [Hardcore] publish_address {127.0.0.1:9300}, bound_addresses {[::1]:9300}, {127.0.0.1:9300}
[2018-12-09 16:17:05,108][INFO ][discovery ] [Hardcore] elasticsearch/iwa8ZuZ-SKK3nHAA1-eYSg
[2018-12-09 16:17:08,141][INFO ][cluster.service ] [Hardcore] new_master {Hardcore}{iwa8ZuZ-SKK3nHAA1-eYSg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
[2018-12-09 16:17:08,170][INFO ][gateway ] [Hardcore] recovered [0] indices into cluster_state
[2018-12-09 16:17:08,178][INFO ][http ] [Hardcore] publish_address {127.0.0.1:9200}, bound_addresses {[::1]:9200}, {127.0.0.1:9200}
[2018-12-09 16:17:08,178][INFO ][node ] [Hardcore] started
[2018-12-09 16:24:41,380][INFO ][node ] [Hardcore] stopping ...
[2018-12-09 16:24:41,397][INFO ][node ] [Hardcore] stopped
[2018-12-09 16:24:41,397][INFO ][node ] [Hardcore] closing ...
[2018-12-09 16:24:41,407][INFO ][node ] [Hardcore] closed

filebeat.yml:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
    - /var/log/secure
    - /var/log/messages
    - /opt/tomcat9/apache-tomcat-9.0.13/logs/catalina.out

setup.kibana:
  host: "192.168.185.143:5601"

#output.elasticsearch:
#  hosts: ["192.168.185.143:9200"]

#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["192.168.185.143:5044"]

Logstash file (logstash.yml):
path.data: /var/lib/logstash
path.logs: /var/log/logstash

Elasticsearch file (elasticsearch.yml):
cluster.name: Elastic-Server
node.name: node-1
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
bootstrap.memory_lock: true
network.host: 192.168.185.143
http.port: 9200
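One thing worth checking: the Elasticsearch log above shows publish_address {127.0.0.1:9200}, i.e. it was only bound to localhost at that point, even though elasticsearch.yml sets network.host: 192.168.185.143. After restarting Elasticsearch you can confirm it is reachable on the configured address and see whether the Filebeat indices from the Logstash log are actually being created:

    curl http://192.168.185.143:9200
    curl http://192.168.185.143:9200/_cat/indices?v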

All the services are running and I'm able to access Kibana on the same server.

Your image shows a really old version of the stack; can you upgrade?

The Kibana version is 4.4.2, and I'm also sharing the Filebeat log from the server where Tomcat is running. I think there is no issue on the Filebeat side and it's working now. If that's so, how can I find out whether Filebeat is connecting to Logstash and Kibana?
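Filebeat has a built-in connectivity check for exactly that (assuming the default config in /etc/filebeat/filebeat.yml):

    filebeat test output   # tests the connection to the configured output (Logstash on 5044 here)

The Kibana connection is only used by "filebeat setup", so the curl check against http://192.168.185.143:5601/api/status shown earlier covers that part.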

Filebeat logs:

May 12 15:25:15 tomcat-beat filebeat: 2019-05-12T15:25:15.086-0700 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":3400,"time":{"ms":9}},"total":{"ticks":9170,"time":{"ms":29},"value":9170},"user":{"ticks":5770,"time":{"ms":20}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":8},"info":{"ephemeral_id":"b802738b-7976-4c32-9fc8-5b9f6f74d5fc","uptime":{"ms":17490031}},"memstats":{"gc_next":5725872,"memory_alloc":4447808,"memory_total":341371472}},"filebeat":{"events":{"added":19,"done":19},"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":19,"batches":1,"total":19},"read":{"bytes":35},"write":{"bytes":2381}},"pipeline":{"clients":1,"events":{"active":0,"published":19,"total":19},"queue":{"acked":19}}},"registrar":{"states":{"current":7,"update":19},"writes":{"success":1,"total":1}},"system":{"load":{"1":0.07,"15":0.05,"5":0.03,"norm":{"1":0.07,"15":0.05,"5":0.03}}}}}}
May 12 15:25:45 tomcat-beat filebeat: 2019-05-12T15:25:45.084-0700 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":3410,"time":{"ms":7}},"total":{"ticks":9190,"time":{"ms":23},"value":9190},"user":{"ticks":5780,"time":{"ms":16}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":8},"info":{"ephemeral_id":"b802738b-7976-4c32-9fc8-5b9f6f74d5fc","uptime":{"ms":17520031}},"memstats":{"gc_next":8426880,"memory_alloc":4288624,"memory_total":343267568}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":1,"batches":1,"total":1},"read":{"bytes":35},"write":{"bytes":999}},"pipeline":{"clients":1,"events":{"active":0,"published":1,"total":1},"queue":{"acked":1}}},"registrar":{"states":{"current":7,"update":1},"writes":{"success":1,"total":1}},"system":{"load":{"1":0.04,"15":0.05,"5":0.03,"norm":{"1":0.04,"15":0.05,"5":0.03}}}}}}

tail -n 10 logstash-plain.log
[2018-12-10T01:21:20,129][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2019.05.12", :_type=>"_doc", :_routing=>nil}, #LogStash::Event:0x41011ed2], :response=>{"create"=>{"_index"=>"filebeat-2019.05.12", "_type"=>"_doc", "id"=>"AWeWxvL2DlIGeKs_l69Z", "status"=>400, "error"=>{"type"=>"invalid_type_name_exception", "reason"=>"Document mapping type name can't start with ''"}}}}

I also upgraded Kibana from 4.4 to 7.0, but the Kibana service failed.
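When the Kibana service fails after an upgrade, the service logs usually say why (the unit name may differ on your distribution):

    systemctl status kibana
    journalctl -u kibana -n 50 --no-pager

Also note that Kibana 7.0 needs an Elasticsearch 7.0 backend; it will not work against Elasticsearch 2.4.6, so the whole stack has to be upgraded together.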
