Hello All,

I have tried Elasticsearch with the fs-river plugin to read a local directory on the file system. One of the files is a text file of about 2.5 GB. While reading this file, the plugin fails with an out-of-memory error and dumps the heap into the Elasticsearch folder, even though I started the ES server with 6 GB of heap, as set in the Elasticsearch configuration.
I checked the code in the fs-river plugin; it loads the file using the following code:
    FileInputStream fileReader = new FileInputStream(file);
    // write it to a byte[] using a buffer since we don't know the exact
    // image size
    byte[] buffer = new byte[1024];
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    int i = 0;
    while (-1 != (i = fileReader.read(buffer))) {
        bos.write(buffer, 0, i);
    }
    byte[] data = bos.toByteArray();
    fileReader.close();
    bos.close();
As far as I can tell, this copies the entire file into memory, which would explain the heap dump. Is there any way to parse a large text file into the ES server using fs-river? Has anyone had success loading heavy text files into ES? There are not a lot of settings to configure, so it is really easy; if I have missed setting some property, please let me know.
The correct method is to write code that reads the file in a streaming manner and extracts the relevant content into JSON documents that can serve as search hits.

Otherwise, you have to prepare the file up front and partition it into documents with a domain-specific parser, a task the fs river was not built for.
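To make the streaming approach concrete, here is a minimal sketch (my own, not fs-river code) that reads a large text file line by line and sends batches of lines to Elasticsearch's _bulk endpoint as JSON documents, so only one batch is ever held in memory. It uses just the JDK; the index name "bigfile", type name "line", batch size, and localhost URL are assumptions for the example.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingIndexer {

        // Assumed endpoint, index/type names, and batch size; adjust for your cluster.
        private static final String BULK_URL = "http://localhost:9200/_bulk";
        private static final int BATCH_SIZE = 1000;

        public static void main(String[] args) throws Exception {
            StringBuilder bulk = new StringBuilder();
            int count = 0;
            // Stream the file line by line instead of buffering 2.5 GB at once.
            BufferedReader reader = new BufferedReader(new FileReader(args[0]));
            String line;
            while ((line = reader.readLine()) != null) {
                // Bulk format: one action line plus one source line per document.
                bulk.append("{\"index\":{\"_index\":\"bigfile\",\"_type\":\"line\"}}\n");
                bulk.append("{\"content\":").append(quote(line)).append("}\n");
                if (++count % BATCH_SIZE == 0) {
                    send(bulk.toString());
                    bulk.setLength(0); // release the batch before reading more
                }
            }
            reader.close();
            if (bulk.length() > 0) {
                send(bulk.toString());
            }
        }

        // Minimal JSON string escaping, enough for this example only.
        private static String quote(String s) {
            return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
        }

        private static void send(String body) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(BULK_URL).openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            OutputStream out = conn.getOutputStream();
            out.write(body.getBytes("UTF-8"));
            out.close();
            conn.getResponseCode(); // read the status so the request completes
            conn.disconnect();
        }
    }

How you split the file into documents (per line, per record, per paragraph) is exactly the domain-specific decision mentioned above; one line per document is just the simplest case.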
Jörg