How can I configure Elasticsearch to use our SAN drive?

Hi,

I have an ELK stack set up on two servers: Logstash on one server, and Elasticsearch and Kibana on the other.

First, I need to know where Elasticsearch stores the incoming logs and how I can change that path. I want to use my SAN drive for storing logs.

See https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-dir-layout.html

Thanks, Mark, for the reply, but my question remains the same: I still couldn't figure out how to configure my SAN storage to store the incoming logs from my Windows machines. I am a newbie to the ELK stack, so maybe I am asking a stupid question, but please bear with me.

Second, how and where can I configure the path, and in which format should I write it: a volume or a UNC path?

Thanks for understanding.

It's not clear what you are asking then.

Do you want to store the indexed logs in ES and then store the ES data on the SAN? Do you want to store the ES logs on the SAN? If not, what exactly do you want?

We have set up the ELK stack on two machines: one for Logstash, the other for Elasticsearch and Kibana. Logstash will forward all logs to Elasticsearch, right? If so, that machine's HDD will fill up very quickly, so we want to use our SAN storage for those incoming logs. If those are called indexed logs, then yes, I want to store the indexed logs on external SAN storage.

Ok.

Then, per that link I posted, you set path.data to whatever the mount is; that's done in elasticsearch.yml.
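A minimal sketch of the relevant part of elasticsearch.yml, assuming the SAN is mounted at a hypothetical /mnt/san on a Linux box (the mount point and directory names below are placeholders, not your real paths):

```yaml
# elasticsearch.yml -- sketch only; /mnt/san is a placeholder mount point
path.data: /mnt/san/elasticsearch/data   # where the indexed documents (your "indexed logs") are stored
path.logs: /mnt/san/elasticsearch/logs   # optional: Elasticsearch's own log files
```

Restart the node after changing this; note that changing path.data does not move any existing data to the new location.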

I am not a Linux/Unix guy and that yml file freaks me out :grin:

Could you please let me know: if I want to store the indexed data on \servername\abc, how can I write that in the yml file?

path.data: \servername\abc

Thanks...

Btw, forward slash / or backslash \ ?

So, I figured out the right syntax, which is as follows:

path.data: \\servername\folder

It's working like a charm...
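For reference, a sketch of how that UNC path sits in elasticsearch.yml on Windows (servername and folder are placeholders for your actual share):

```yaml
# elasticsearch.yml -- UNC path sketch; \\servername\folder is a placeholder
path.data: \\servername\folder
# If you quote the value, prefer single quotes: in YAML double-quoted
# scalars the backslashes would be treated as escape sequences.
# path.data: '\\servername\folder'
```

One caveat: the account the Elasticsearch service runs as needs read/write permission on that share, or the node will fail to start.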