At the moment we have a 3.3TB disk that hosts both the OS and Elasticsearch. Lately we've been experiencing a lot of issues with the filesystem. The host drops into fsck, and until that finishes we can't access the OS. We are going to partition the disks: one small partition for the OS, and a second one for ES data. I was wondering, is there a way to distribute the data folder across multiple smaller disks?
Yes, you can list multiple data paths in the ES configuration.
There's nothing unusual about a 3 TB volume. It shouldn't misbehave like that. I'd get to the bottom of that instead of trying to solve the problem by partitioning.
Oh, that's interesting! So how will Elasticsearch decide which data path it's going to write to? I have one instance of ES; how exactly can I map multiple data paths?
The 3TB partition is, I believe, 12 smaller disks combined.
The path.data configuration setting accepts a list of directory names. From a quick search, the documentation on this seems scarce, but I know it has been discussed here before.
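For reference, a minimal sketch of what this looks like in elasticsearch.yml; the mount points below are hypothetical placeholders:

    # path.data accepts either a comma-separated string or a YAML list;
    # the two forms below are equivalent.
    path.data: /mnt/disk1,/mnt/disk2

    # ...or, as a list:
    path.data:
      - /mnt/disk1
      - /mnt/disk2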
But again, why do you think this is going to solve your problems? It sounds like you're having hardware issues or operating system bugs, and a different partitioning scheme isn't likely to fix that.
At the moment I am working with some reclaimed HP DL380s that were previously used for Splunk.
Unfortunately, I don't manage the servers. I have to work with the group that handles these servers to get a root cause. Their perspective is that when a host goes down, it shouldn't take a long time to rebuild it. They can replace the disk for a particular data path and bring the node back up.
Are there any recommendations for CentOS 6.x, particularly anything that needs to be tweaked at the OS level for Elasticsearch, specifically around disk handling?
I am going to go with the multiple-data-path configuration, where I can use multiple mounts with smaller disks. If anyone has had issues with this, please let me know.
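Sketching the layout I have in mind, assuming four hypothetical smaller disks each mounted under /esdata (the directory names are placeholders):

    # elasticsearch.yml -- one data path per mounted disk;
    # each /esdata/diskN is a separate mount backed by its own smaller disk.
    path.data:
      - /esdata/disk1
      - /esdata/disk2
      - /esdata/disk3
      - /esdata/disk4

Each directory just needs to exist and be writable by the user Elasticsearch runs as before the node starts.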