Problem with Elastic Node

Hi,

I'm seeing some strange behavior with one of the Elasticsearch nodes in my cluster.
The cluster has 7 nodes, each on a different host: 6 Linux machines and 1 Windows machine.

When I run the following in Dev Tools in Kibana:

GET /_cat/nodes
135.238.239.48 24 98 3 0.22 0.25 0.24 mdi - xh-fr-elastic-1
10.158.67.175 14 99 2 0.10 0.42 1.65 mdi - xh-gr-elastic-1
10.158.67.107 19 87 2 mdi - xh-gr-elastic-3
10.159.166.9 66 99 3 4.05 5.20 3.53 mdi * xh-gr-elastic-2
151.98.17.60 37 0 2 0.66 0.74 0.81 mdi - xh-it-elastic-1
151.98.17.34 14 97 8 0.88 0.83 0.77 mdi - xh-it-elastic-2
135.238.239.132 37 36 5 3.09 3.38 3.43 mdi - xh-fr-elastic-2

I get all 7 nodes back.
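For readability, the same request with column headers is:

GET /_cat/nodes?v

The unlabeled columns above are the _cat/nodes defaults: ip, heap.percent, ram.percent, cpu, load_1m, load_5m, load_15m, node.role, master, and name.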

But on the Monitoring tab in Kibana, one of the nodes shows as offline.

Do you have any idea what is going wrong?

Best Regards,
Thanos

What do the logs on that node show?
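It's also worth checking whether monitoring collection is enabled cluster-wide. Assuming you're using the built-in collection rather than Metricbeat, something like this in Dev Tools will show the setting:

GET /_cluster/settings?include_defaults=true&filter_path=**.xpack.monitoring.collection.enabled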

How can I see them?

Usually it's in /var/log/elasticsearch/elasticsearch.log.
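If the node runs under systemd you can also follow the service output directly. A couple of quick commands, assuming the default service and cluster name:

sudo journalctl -u elasticsearch --since "1 hour ago"
sudo tail -n 200 /var/log/elasticsearch/elasticsearch.log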

This is all I got after restarting the Elasticsearch service:

[2019-05-31T10:48:29,645][INFO ][o.e.p.PluginsService ] [xh-it-elastic-1] loaded module [x-pack-sql]
[2019-05-31T10:48:29,645][INFO ][o.e.p.PluginsService ] [xh-it-elastic-1] loaded module [x-pack-watcher]
[2019-05-31T10:48:29,646][INFO ][o.e.p.PluginsService ] [xh-it-elastic-1] no plugins loaded
[2019-05-31T10:48:33,996][INFO ][o.e.x.s.a.s.FileRolesStore] [xh-it-elastic-1] parsed [0] roles from file [/etc/elasticsearch/roles.yml]
[2019-05-31T10:48:34,626][INFO ][o.e.x.m.p.l.CppLogMessageHandler] [xh-it-elastic-1] [controller/28742] [Main.cc@109] controller (64 bit): Version 7.1.1 (Build fd619a36eb77df) Copyright (c) 2019 Elasticsearch BV
[2019-05-31T10:48:35,039][DEBUG][o.e.a.ActionModule ] [xh-it-elastic-1] Using REST wrapper from plugin org.elasticsearch.xpack.security.Security
[2019-05-31T10:48:38,012][INFO ][o.e.d.DiscoveryModule ] [xh-it-elastic-1] using discovery type [zen] and seed hosts providers [settings]
[2019-05-31T10:48:39,311][INFO ][o.e.n.Node ] [xh-it-elastic-1] initialized
[2019-05-31T10:48:39,313][INFO ][o.e.n.Node ] [xh-it-elastic-1] starting ...
[2019-05-31T10:48:39,483][INFO ][o.e.t.TransportService ] [xh-it-elastic-1] publish_address {151.98.17.60:9300}, bound_addresses {[::]:9300}
[2019-05-31T10:48:39,492][INFO ][o.e.b.BootstrapChecks ] [xh-it-elastic-1] bound or publishing to a non-loopback address, enforcing bootstrap checks
[2019-05-31T10:48:39,556][INFO ][o.e.c.c.Coordinator ] [xh-it-elastic-1] cluster UUID [EhAOHSiYSYSddPezc3yaeQ]
[2019-05-31T10:48:44,055][INFO ][o.e.c.s.ClusterApplierService] [xh-it-elastic-1] master node changed {previous , current [{xh-gr-elastic-3}{Vwfbqe-rTeCaWtWG5zlNgA}{qkwaJhOHTwu7Jqmr5h-rUw}{10.158.67.107}{10.158.67.107:9300}{ml.machine_memory=17179332608, ml.max_open_jobs=20, xpack.installed=true}]}, added {{xh-gr-elastic-2}{OETgEHqTR9Ku30WwPWADyg}{QZG2JlyYR0iiA8KJw0euwg}{10.159.166.9}{10.159.166.9:9300}{ml.machine_memory=269932404736, ml.max_open_jobs=20, xpack.installed=true},{xh-gr-elastic-3}{Vwfbqe-rTeCaWtWG5zlNgA}{qkwaJhOHTwu7Jqmr5h-rUw}{10.158.67.107}{10.158.67.107:9300}{ml.machine_memory=17179332608, ml.max_open_jobs=20, xpack.installed=true},{xh-it-elastic-2}{_BIH8swLQz6rJa_ImpYJuA}{UkncF8zPShSpNa0BTkwPSg}{151.98.17.34}{151.98.17.34:9300}{ml.machine_memory=8186564608, ml.max_open_jobs=20, xpack.installed=true},{xh-fr-elastic-2}{14b6D9VCR36schkKD3k74A}{eRwAvrHWRzuTMyKCXZPGFg}{135.238.239.132}{135.238.239.132:9300}{ml.machine_memory=269930721280, ml.max_open_jobs=20, xpack.installed=true},{xh-fr-elastic-1}{isTX9Dk7SMSaP3GARPtU9A}{UhTAE1ctSZmCwlehGe1Krg}{135.238.239.48}{135.238.239.48:9300}{ml.machine_memory=16654970880, ml.max_open_jobs=20, xpack.installed=true},{xh-gr-elastic-1}{6q5asfwjQ_eoI3xkl2-JXg}{C6DNUc3CSxKKpE_HkIVpgA}{10.158.67.175}{10.158.67.175:9300}{ml.machine_memory=16654872576, ml.max_open_jobs=20, xpack.installed=true},}, term: 2757, version: 221080, reason: ApplyCommitRequest{term=2757, version=221080, sourceNode={xh-gr-elastic-3}{Vwfbqe-rTeCaWtWG5zlNgA}{qkwaJhOHTwu7Jqmr5h-rUw}{10.158.67.107}{10.158.67.107:9300}{ml.machine_memory=17179332608, ml.max_open_jobs=20, xpack.installed=true}}
[2019-05-31T10:48:44,067][INFO ][o.e.c.s.ClusterSettings ] [xh-it-elastic-1] updating [xpack.monitoring.collection.enabled] from [false] to [true]
[2019-05-31T10:48:44,359][WARN ][o.e.x.s.a.s.m.NativeRoleMappingStore] [xh-it-elastic-1] Failed to clear cache for realms []
[2019-05-31T10:48:44,362][INFO ][o.e.x.s.a.TokenService ] [xh-it-elastic-1] refresh keys
[2019-05-31T10:48:44,672][INFO ][o.e.x.s.a.TokenService ] [xh-it-elastic-1] refreshed keys
[2019-05-31T10:48:44,844][INFO ][o.e.l.LicenseService ] [xh-it-elastic-1] license [b76576f0-0b73-475d-bdf9-69c0453c1ee5] mode [basic] - valid
[2019-05-31T10:48:44,891][INFO ][o.e.h.AbstractHttpServerTransport] [xh-it-elastic-1] publish_address {151.98.17.60:9200}, bound_addresses {[::]:9200}
[2019-05-31T10:48:44,892][INFO ][o.e.n.Node ] [xh-it-elastic-1] started
[2019-05-31T10:48:48,258][INFO ][o.e.m.j.JvmGcMonitorService] [xh-it-elastic-1] [gc][young][8][3] duration [1s], collections [1]/[1.9s], total [1s]/[1.3s], memory [1.8gb]->[1.1gb]/[7.7gb], all_pools {[young] [1.6gb]->[35.8mb]/[1.8gb]}{[survivor] [173.5mb]->[232.9mb]/[232.9mb]}{[old] [0b]->[917mb]/[5.7gb]}
[2019-05-31T10:48:48,260][WARN ][o.e.m.j.JvmGcMonitorService] [xh-it-elastic-1] [gc][8] overhead, spent [1s] collecting in the last [1.9s]
[2019-05-31T10:49:02,271][INFO ][o.e.m.j.JvmGcMonitorService] [xh-it-elastic-1] [gc][22] overhead, spent [286ms] collecting in the last [1s]
[2019-05-31T10:49:10,278][INFO ][o.e.m.j.JvmGcMonitorService] [xh-it-elastic-1] [gc][30] overhead, spent [296ms] collecting in the last [1s]
[2019-05-31T10:49:27,298][INFO ][o.e.m.j.JvmGcMonitorService] [xh-it-elastic-1] [gc][47] overhead, spent [255ms] collecting in the last [1s]
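Those [gc] overhead warnings mean the JVM is spending a large share of its time in garbage collection. Sustained heap pressure can delay the monitoring documents a node ships, which is one way a node can show as offline in the Monitoring tab while _cat/nodes still lists it. As a quick check of heap usage on that node (node name taken from your logs above):

GET /_nodes/xh-it-elastic-1/stats/jvm?filter_path=nodes.*.jvm.mem.heap_used_percent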
