7.10 ECONNREFUSED on Kibana

Hi All,

I have a bit of a problem. I currently have one VM (CentOS) running Elasticsearch 7.10, and a second VM (CentOS) running Kibana 7.10 and Nginx.

The error I'm getting is:

Dec 07 11:17:07 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:07Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:09 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:09Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:12 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:12Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:14 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:14Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:17 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:17Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:19 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:19Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:22 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:22Z","tags":["error","elasticsearch","data"],"pid":9059,"message":"[ConnectionErro>
Dec 07 11:17:23 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:23Z","tags":["warning","elasticsearch","monitoring"],"pid":9059,"message":"Unable >
Dec 07 11:17:23 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:23Z","tags":["warning","elasticsearch","monitoring"],"pid":9059,"message":"No livi>
Dec 07 11:17:23 slazKIB02 kibana[9059]: {"type":"log","@timestamp":"2020-12-07T11:17:23Z","tags":["warning","plugins","licensing"],"pid":9059,"message":"License inform>
I've opened all the ports on both VMs but still don't have any luck.

My kibana.yml looks like this:

server.port: 5601
server.host: "localhost"
elasticsearch.hosts: ["http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com"]
elasticsearch.ssl.verificationMode: none

My elasticsearch.yml looks like this:

cluster.name: ElasticStack
node.name: ${HOSTNAME}
#node.attr.rack: r1
#xpack.security.authc.api_key.enabled
#xpack.monitoring.collection.enabled: true
xpack.security.enabled: true

path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch

transport.host: localhost
transport.tcp.port: 9300
network.host: 0.0.0.0
http.port: 9200
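
With network.host: 0.0.0.0 the node should be listening on 9200 on every interface. For reference, this is the kind of quick check I can run on the ES VM itself to confirm that (the elastic user below is just a placeholder, since security is enabled):

ss -tlnp | grep 9200
curl -u elastic http://localhost:9200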

When I open up the URL of my Kibana VM it just says:
"Kibana server is not ready yet"

Please help a friend out :slight_smile:

Hey @CLeO_Zap,

I'm not a VM expert, but did you make sure you can really access the ES URL from the Kibana VM (e.g. via cURL)? Just to rule out a general VM network setup issue that isn't related to the Elastic Stack itself.
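
Something along these lines from the Kibana VM would confirm basic reachability (use the exact URL you have in elasticsearch.hosts; the elastic user is just an example, since you have security enabled):

curl -v -u elastic "http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com"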

Best,
Oleg

Hi @azasypkin,

Yeah, I've double-checked, and I previously had a system like this working across 2 different VMs. It's only since I upgraded to 7.10 that it's decided to give up.

Thank you for your reply

Okay, good. Then a few more questions:

  • Can you enable verbose logging (logging.verbose: true) and share the full log? Just in case another error happens before the ConnectionError.

  • I see you have security enabled, but you didn't mention elasticsearch.username/password in your kibana.yml. Did you just omit them, do you pass these values through other means (env variables or the keystore), or do you have an authenticating proxy in front of ES? (See the snippet just below this list for what I mean.)
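
For reference, that would be something like the following in kibana.yml (the values are placeholders; in 7.x the built-in kibana_system user is the usual choice here):

logging.verbose: true
elasticsearch.username: "kibana_system"
elasticsearch.password: "<your-password>"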

Here's the Kibana log output with logging.verbose: true:

Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","plugins","canvas"],"pid":16965,"message":"Initializing plug>
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","legacy-service"],"pid":16965,"message":"setting up legacy s>
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","core-app"],"pid":16965,"message":"Setting up core app."}
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","root"],"pid":16965,"message":"starting root"}
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","server"],"pid":16965,"message":"starting server"}
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","savedobjects-service"],"pid":16965,"message":"Starting Save>
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","config"],"pid":16965,"message":"Marking config path as hand>
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["info","savedobjects-service"],"pid":16965,"message":"Waiting until >
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["debug","status"],"pid":16965,"status":{"level":"unavailable","summa>
Dec 07 12:21:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:21:02Z","tags":["error","elasticsearch","monitoring"],"pid":16965,"message":"Request>

Dec 07 12:24:57 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:24:57Z","tags":["debug","metrics"],"pid":16965,"message":"Refreshing metrics"}
Dec 07 12:24:58 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:24:58Z","tags":["error","elasticsearch","data"],"pid":16965,"message":"[ConnectionEr>
Dec 07 12:25:00 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:00Z","tags":["error","elasticsearch","data"],"pid":16965,"message":"[ConnectionEr>
Dec 07 12:25:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:02Z","tags":["debug","metrics"],"pid":16965,"message":"Refreshing metrics"}
Dec 07 12:25:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:02Z","tags":["warning","elasticsearch","monitoring"],"pid":16965,"message":"Unabl>
Dec 07 12:25:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:02Z","tags":["warning","elasticsearch","monitoring"],"pid":16965,"message":"No li>
Dec 07 12:25:02 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:02Z","tags":["warning","plugins","licensing"],"pid":16965,"message":"License info>
Dec 07 12:25:03 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:03Z","tags":["error","elasticsearch","data"],"pid":16965,"message":"[ConnectionEr>
Dec 07 12:25:05 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:05Z","tags":["error","elasticsearch","data"],"pid":16965,"message":"[ConnectionEr>
Dec 07 12:25:07 slazKIB02 kibana[16965]: {"type":"log","@timestamp":"2020-12-07T12:25:07Z","tags":["debug","metrics"],"pid":16965,"message":"Refreshing metrics"}

And yes, I just left out the password info. Thank you again.

Hmm, looks good to me. By the way, in http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com, did you forget to add the 9200 port, or do you have some sort of 80 -> 9200 forwarding configured on Azure?

So if 1) this URL is correct, 2) you're sure you don't need the 9200 port, and 3) you can cURL ES using this URL from the Kibana VM (meaning the Azure firewall is properly configured and allows the required ports), then I don't really know what could be wrong here.
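
A quick way to see the difference from the Kibana VM (same placeholder host; add -u <user> if the endpoint asks for credentials):

# no port: this hits port 80 on that host (whatever happens to be listening there), not Elasticsearch
curl -v "http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com"
# with the port: this talks to Elasticsearch directly on 9200
curl -v -u elastic "http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com:9200"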

I'll get back if I have more ideas :slight_smile:

@azasypkin
I'm sorry, I feel like such an idiot. Because I had my config files on GitHub, I used them directly to update the config files on this 'updated' system, and the problem was that I didn't put ':9200' at the end of the URL. Thank you for your time.
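
For anyone who runs into the same thing, the working line in my kibana.yml ended up looking like this (host placeholder kept as above):

elasticsearch.hosts: ["http://XXXXXXXXXX-XXX.XXXXXXXXXX.XXXXXXXX.azure.com:9200"]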


No problem, glad you sorted this out!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.