Run ES on two servers and store the index on a NAS

Hello All,
Is the following approach feasible?

I want to set up high availability for Elasticsearch. I can put HAProxy in front of two ES servers and route traffic away from a node that is down. But my question is: if I store the index data on a NAS drive (mounted on both ES servers), can ES 2 (the second node) write to the existing index created by ES 1 (the first node)?

Example:

Filebeat -> Elasticsearch (Server 1) -> Index stored on the NAS

If Server 1 goes down, I want the incoming logs written to the same index created in the step above:

Filebeat -> Elasticsearch (Server 2) -> Use the index created above and continue storing the data on the NAS

Please let me know if it requires additional explanation.

Just use local disks.
Start 2 nodes. Elasticsearch will manage all that for you.

(Adding a 3rd node is recommended for production.)
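For example, a minimal two-node cluster configuration might look something like this (a sketch assuming Elasticsearch 7.x or later; the host names es-node-1 and es-node-2 are placeholders for your two servers):

```yaml
# elasticsearch.yml on Server 1
cluster.name: logging-cluster
node.name: es-node-1
discovery.seed_hosts: ["es-node-1", "es-node-2"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2"]

# elasticsearch.yml on Server 2 -- same cluster.name, so both nodes join the same cluster
cluster.name: logging-cluster
node.name: es-node-2
discovery.seed_hosts: ["es-node-1", "es-node-2"]
cluster.initial_master_nodes: ["es-node-1", "es-node-2"]
```

With the default of one replica per shard, each index has a full copy on both nodes, so losing one node does not lose data. There is no need for a shared NAS volume; two nodes cannot share the same data directory anyway.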

Can you elaborate on "start 2 nodes", please? Do you mean I can run a node on each of the two servers and ES will distribute the load between them?

What happens when Node 1 goes down? The storage on Node 1 will no longer be available, right?

Just try it.
As I said, Elasticsearch will manage all that for you.
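On the ingestion side, Filebeat can be pointed at both nodes directly, so if one node is unreachable events keep flowing to the other. A minimal sketch, using the same placeholder host names as above:

```yaml
# filebeat.yml -- list both nodes so output continues if one is down
output.elasticsearch:
  hosts: ["http://es-node-1:9200", "http://es-node-2:9200"]
```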

Is there any documentation you can share on starting 2 ES nodes, please?

Locally? For testing?

Or in production?

Anyway, I'd recommend reading this: https://www.elastic.co/guide/en/elasticsearch/reference/current/important-settings.html
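As a rough illustration of the kind of settings that page covers (placeholder paths and address; adjust for your servers), the data and log paths stay on each server's local disk:

```yaml
# elasticsearch.yml -- per-node path and network settings
path.data: /var/lib/elasticsearch     # local disk on this server, not a shared NAS mount
path.logs: /var/log/elasticsearch
network.host: 192.168.1.10            # this server's own address, so the other node and Filebeat can reach it
```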

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.