"socket hang up"

Hi all, I'm getting some strange timeouts. I have several Kibana instances. When I go to the Management tab and refresh the fields, I see the following log messages:

{"type":"response","@timestamp":"2017-05-31T14:17:04Z","tags":,"pid":25970,"method":"get","statusCode":200,"req":{"url":"/api/kibana/logstash-/field_capabilities","method":"get","headers":{"connection":"upgrade","host":"kibana06-vlp.du.edu","x-real-ip":"130.253.15.45, 130.253.15.45","x-forwarded-for":"130.253.15.45","x-forwarded-proto":"https","accept":"application/json, text/plain, /","kbn-version":"5.4.0","user-agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36","referer":"https://kibana06-vlp.du.edu/app/kibana","accept-encoding":"gzip, deflate, sdch, br","accept-language":"en-US,en;q=0.8"},"remoteAddress":"127.0.0.1","userAgent":"127.0.0.1","referer":"https://kibana06-vlp.du.edu/app/kibana"},"res":{"statusCode":200,"responseTime":120024,"contentLength":9},"message":"GET /api/kibana/logstash-/field_capabilities 200 120024ms - 9.0B"}
{"type":"log","@timestamp":"2017-05-31T14:19:13Z","tags":["error","elasticsearch","admin"],"pid":25970,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T14:19:30Z","tags":["status","plugin:elasticsearch@5.4.0","info"],"pid":25970,"state":"green","message":"Status changed from red to green - Kibana index ready","prevState":"red","prevMsg":{"code":"ECONNRESET"}}
{"type":"log","@timestamp":"2017-05-31T14:19:30Z","tags":["status","ui settings","info"],"pid":25970,"state":"green","message":"Status changed from red to green - Ready","prevState":"red","prevMsg":"Elasticsearch plugin is red"}

It's not Elasticsearch, as I can run searches using curl with no issue. I deleted the indices associated with the Kibana instance and attempted to recreate them, but I still get the messages. I tried backing out to the previous version of Kibana, 5.3, but that didn't work. I am currently on Kibana 5.4.0.

Any ideas?

Are the mappings for your index large? The request may be taking longer than Kibana's configured request timeout.
Bumping elasticsearch.requestTimeout in kibana.yml is worth a try, as sketched below.
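Something like this (120000 is just an illustrative value, not a recommendation; Kibana's default is 30000 ms):

> # kibana.yml
> # Milliseconds Kibana will wait for a response from Elasticsearch (default 30000)
> elasticsearch.requestTimeout: 120000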

If that doesn't work, are there any proxies between Elasticsearch and Kibana? It might be worth checking whether the connection works directly; there could be a timeout on the proxy. One way to rule that out is sketched below.
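Assuming the node address from your logs, you can point Kibana straight at Elasticsearch in kibana.yml so no proxy sits in between:

> # kibana.yml
> # Bypass any intermediate proxy by targeting the node directly
> elasticsearch.url: "http://kibana06-vlp.du.edu:9200"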

I bumped the timeout up to some crazy high number and that didn't work. I have 2477 fields. I also installed X11 and Firefox and went in locally to bypass nginx.
One thing I just tried is pointing Kibana at a new set of indices; on startup I see the indices created, but I get the same sort of error messages:

{"type":"response","@timestamp":"2017-05-31T18:38:41Z","tags":[],"pid":1831,"method":"get","statusCode":200,"req":{"url":"/elasticsearch/logstash-/_mapping/field/?=1496255921770&ignore_unavailable=false&allow_no_indices=false&include_defaults=true","method":"get","headers":{"host":"localhost:5601","user-agent":"Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0","accept":"application/json, text/plain, /","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate","kbn-version":"5.4.0","referer":"http://localhost:5601/app/kibana","connection":"keep-alive"},"remoteAddress":"127.0.0.1","userAgent":"127.0.0.1","referer":"http://localhost:5601/app/kibana"},"res":{"statusCode":200,"responseTime":5354,"contentLength":9},"message":"GET /elasticsearch/logstash-/_mapping/field/?=1496255921770&ignore_unavailable=false&allow_no_indices=false&include_defaults=true 200 5354ms - 9.0B"}
{"type":"log","@timestamp":"2017-05-31T18:39:17Z","tags":["error","elasticsearch","data"],"pid":1831,"message":"Request error, retrying\nPOST http://kibana06-vlp.du.edu:9200/logstash-/_field_stats?fields=&allow_no_indices=false => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:39:19Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes..version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:39:47Z","tags":["error","elasticsearch","data"],"pid":1831,"message":"Request error, retrying\nPOST http://kibana06-vlp.du.edu:9200/logstash-
/_field_stats?fields=&allow_no_indices=false => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:39:49Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes.
.version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:40:17Z","tags":["error","elasticsearch","data"],"pid":1831,"message":"Request error, retrying\nPOST http://kibana06-vlp.du.edu:9200/logstash-/_field_stats?fields=&allow_no_indices=false => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:40:19Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes..version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:40:47Z","tags":["error","elasticsearch","data"],"pid":1831,"message":"Request complete with error\nPOST http://kibana06-vlp.du.edu:9200/logstash-
/_field_stats?fields=&allow_no_indices=false => socket hang up"}
{"type":"response","@timestamp":"2017-05-31T18:38:47Z","tags":[],"pid":1831,"method":"get","statusCode":503,"req":{"url":"/api/kibana/logstash-
/field_capabilities","method":"get","headers":{"host":"localhost:5601","user-agent":"Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0","accept":"application/json, text/plain, /","accept-language":"en-US,en;q=0.5","accept-encoding":"gzip, deflate","kbn-version":"5.4.0","referer":"http://localhost:5601/app/kibana","connection":"keep-alive"},"remoteAddress":"127.0.0.1","userAgent":"127.0.0.1","referer":"http://localhost:5601/app/kibana"},"res":{"statusCode":503,"responseTime":120058,"contentLength":9},"message":"GET /api/kibana/logstash-/field_capabilities 503 120058ms - 9.0B"}
{"type":"log","@timestamp":"2017-05-31T18:40:49Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request complete with error\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes.
.version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:40:49Z","tags":["status","plugin:elasticsearch@5.4.0","error"],"pid":1831,"state":"red","message":"Status changed from green to red - Error: socket hang up","prevState":"green","prevMsg":"Kibana index ready"}
{"type":"log","@timestamp":"2017-05-31T18:40:49Z","tags":["status","ui settings","error"],"pid":1831,"state":"red","message":"Status changed from green to red - Elasticsearch plugin is red","prevState":"green","prevMsg":"Ready"}
{"type":"log","@timestamp":"2017-05-31T18:41:21Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes..version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:41:51Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes.
.version%2Cnodes..http.publish_address%2Cnodes..ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:42:21Z","tags":["error","elasticsearch","admin"],"pid":1831,"message":"Request error, retrying\nGET http://kibana06-vlp.du.edu:9200/_nodes?filter_path=nodes..version%2Cnodes..http.publish_address%2Cnodes.*.ip => socket hang up"}
{"type":"log","@timestamp":"2017-05-31T18:42:42Z","tags":["status","plugin:elasticsearch@5.4.0","info"],"pid":1831,"state":"green","message":"Status changed from red to green - Kibana index ready","prevState":"red","prevMsg":{"code":"ECONNRESET"}}
{"type":"log","@timestamp":"2017-05-31T18:42:42Z","tags":["status","ui settings","info"],"pid":1831,"state":"green","message":"Status changed from red to green - Ready","prevState":"red","prevMsg":"Elasticsearch plugin is red"}

Another clue to my issue: if I run
curl 'http://localhost:5601/api/kibana/logstash-2017.05.31/field_capabilities'
it fails with "Empty reply from server" (curl's way of reporting that the server closed the connection without responding, which matches the socket hang up in the Kibana logs), but if I run
curl 'http://localhost:5601/api/kibana/logstash-2017.05.29/field_capabilities'
it works. So I am thinking I have an issue with my logstash indices from the 30th and today.

OK, I found the fix for this, and it's a bit odd, but who cares.
I put the following lines into my elasticsearch.yml file:
> http.cors.enabled: true
> http.cors.allow-origin: "*"
> http.max_header_size: 16kb
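If I had to guess (unconfirmed), the setting that actually mattered is http.max_header_size: Elasticsearch's default is 8kb, and a request whose headers exceed that can be cut off without a response, which Kibana surfaces as a socket hang up. The two CORS settings shouldn't affect Kibana's server-to-server requests.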
