Failed to accept a connection. java.io.IOException: Too many open files

Hello, I am using a river to write to an ES index. Partway through importing 500 documents, the run fails with java.io.IOException (Too many open files). Is there a way to have the index files merged once a certain number of documents has been reached? I looked in the elasticsearch.yml configuration file but could not find a property like that.
org.elasticsearch.index.engine.CreateFailedEngineException: [estestindex][2] Create failed for [estesttype#_3vBD9f0SkKWbqqlGiC9VA]
at org.elasticsearch.index.engine.robin.RobinEngine.create(RobinEngine.java:370)
at org.elasticsearch.index.shard.service.InternalIndexShard.create(InternalIndexShard.java:308)
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:164)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction.performOnPrimary(TransportShardReplicationOperationAction.java:532)
at org.elasticsearch.action.support.replication.TransportShardReplicationOperationAction$AsyncShardOperationAction$1.run(TransportShardReplicationOperationAction.java:430)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)
Caused by: java.io.FileNotFoundException: /home/fsp/es/elasticsearch3/elasticsearch-0.20.5/data/esZJW/nodes/0/indices/estestindex/2/index/_119a.fdt (Too many open files)
at java.io.RandomAccessFile.open(Native Method)
at java.io.RandomAccessFile.&lt;init&gt;(RandomAccessFile.java:212)
at org.apache.lucene.store.FSDirectory$FSIndexOutput.&lt;init&gt;(FSDirectory.java:441)
at org.apache.lucene.store.FSDirectory.createOutput(FSDirectory.java:306)
at org.apache.lucene.store.XNIOFSDirectory.createOutput(XNIOFSDirectory.java:48)
at org.elasticsearch.index.store.Store$StoreDirectory.createOutput(Store.java:487)
at org.elasticsearch.index.store.Store$StoreDirectory.createOutput(Store.java:459)
at org.apache.lucene.index.FieldsWriter.&lt;init&gt;(FieldsWriter.java:83)
at org.apache.lucene.index.StoredFieldsWriter.initFieldsWriter(StoredFieldsWriter.java:64)
at org.apache.lucene.index.StoredFieldsWriter.finishDocument(StoredFieldsWriter.java:107)
at org.apache.lucene.index.StoredFieldsWriter$PerDoc.finish(StoredFieldsWriter.java:151)
at org.apache.lucene.index.DocumentsWriter$WaitQueue.writeDocument(DocumentsWriter.java:1404)
at org.apache.lucene.index.DocumentsWriter$WaitQueue.add(DocumentsWriter.java:1424)
at org.apache.lucene.index.DocumentsWriter.finishDocument(DocumentsWriter.java:1043)
at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:772)
at org.apache.lucene.index.IndexWriter.addDocument(IndexWriter.java:2060)
at org.elasticsearch.index.engine.robin.RobinEngine.innerCreate(RobinEngine.java:470)
at org.elasticsearch.index.engine.robin.RobinEngine.create(RobinEngine.java:365)
... 7 more
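The usual fixes for this error are to raise the operating-system open-file limit for the Elasticsearch process and to reduce the number of segment files the index keeps open. As a sketch (property names taken from the 0.20.x index-module settings; please verify them against your version's documentation):

```yaml
# /etc/security/limits.conf — raise the per-process file descriptor limit
# for the user running Elasticsearch, e.g.:
#   fsp  -  nofile  65535

# elasticsearch.yml
# Store each Lucene segment as one compound file instead of many small
# files; this trades a little indexing speed for far fewer open handles.
index.compound_format: true

# Allow fewer segments per tier so merges are triggered sooner
# (the tiered merge policy default is 10).
index.merge.policy.segments_per_tier: 5
```

There is no setting that merges after a fixed document count, but lowering `segments_per_tier` makes merging more aggressive, and the `_optimize` API can force-merge an existing index down to a few segments.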

Second question: in my database, prod_id is the primary key, and I would like the document _id to take the value of prod_id. I tried the method below, but it failed. Does anyone know how _id should be set?
Here is how I currently set the mapping:

XContentBuilder mappingBuilder = XContentFactory.jsonBuilder()
    .startObject()
        .startObject(indexType)
            .startObject("_id").field("path", "prod_id").endObject()
            .startObject("properties")
                .startObject("prod_id").field("type", "string").endObject()
                .startObject("prod_name").field("type", "string").endObject()
            .endObject()
        .endObject()
    .endObject();
PutMappingRequest mappingRequest = Requests.putMappingRequest(indexName).type(indexType).source(mappingBuilder);
client.admin().indices().putMapping(mappingRequest).actionGet();
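For reference, the builder above is meant to produce a mapping like the following (the `_id` `path` option existed in 0.x-era Elasticsearch but was removed in later versions, so treat this as a sketch for 0.20.x):

```json
{
  "estesttype": {
    "_id": { "path": "prod_id" },
    "properties": {
      "prod_id":   { "type": "string" },
      "prod_name": { "type": "string" }
    }
  }
}
```

If the mapping route still fails, a more robust alternative is to set the id explicitly on each index request, e.g. `client.prepareIndex(indexName, indexType, prodId).setSource(doc)`, which works regardless of whether `_id.path` is supported.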