I'm trying to index 4,000 documents, but I always get an
OutOfMemory exception.
My code is a simple loop around the following method:
public void indicizzaDocumento(String xpIndice, InputStream xpContent,
        HashMap<String, Object> xpParametri) throws IOException {
    // Copy the caller's parameters so the shared map is not modified.
    HashMap<String, Object> mappa = (HashMap<String, Object>) xpParametri.clone();
    long inizio = new Date().getTime();

    // Read the whole attachment into memory and encode it for the JSON source.
    byte[] contiene = IOUtils.toByteArray(xpContent);
    String contenuto = new String(JsonUtils.encode(contiene));
    mappa.put(CONTENUTO, contenuto);

    // Serialize the map to JSON (mapper is assumed to be a Jackson ObjectMapper field).
    StringWriter sw = new StringWriter();
    mapper.writeValue(sw, mappa);
    String json = sw.getBuffer().toString();

    // Index the document synchronously and wait for the response.
    client.prepareIndex(INDICE, TIPO, xpIndice).setSource(json).execute().actionGet();

    long gap = new Date().getTime() - inizio;
    System.out.println("millis for " + xpIndice + " " + gap);
}
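The loop itself looks more or less like the sketch below; the indexAll name, the file list, and the id scheme are illustrative assumptions, not the exact code (it also needs the usual java.io and java.util imports):

// Sketch of the calling loop; file list, id scheme and parameter map
// are assumptions for illustration, not the exact production code.
public void indexAll(List<File> files, HashMap<String, Object> parametri) throws IOException {
    int id = 0;
    for (File f : files) {
        InputStream in = new FileInputStream(f);
        try {
            // Each call buffers the whole attachment several times over
            // (byte[], encoded String, JSON String) before the synchronous
            // index request returns.
            indicizzaDocumento(String.valueOf(id++), in, parametri);
        } finally {
            in.close();
        }
    }
}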
What's wrong?
I tried to profile the code, but found nothing relevant.
Thanks,
Tullio
Each document is roughly 1 MB on average.
Nothing relevant in the settings, I believe:
. shards 5
. replicas 1
. type local
. index.mapping.attachment.indexed_chars -1
Thanks,
Tullio
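For reference, a sketch of how those index settings might be applied at index creation time, assuming a 0.19-era Java client and the mapper-attachments plugin; ImmutableSettings is org.elasticsearch.common.settings.ImmutableSettings, and this is an illustration, not the exact setup used here:

// Sketch only: create the index with the settings listed above.
client.admin().indices().prepareCreate(INDICE)
        .setSettings(ImmutableSettings.settingsBuilder()
                .put("index.number_of_shards", 5)
                .put("index.number_of_replicas", 1)
                // -1 asks the attachment mapper to extract all characters
                // instead of stopping at the default 100000.
                .put("index.mapping.attachment.indexed_chars", -1)
                .build())
        .execute().actionGet();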