I have tried increasing the size of the file for the Bulk import. I can successfully import JSON files of up to 160 MB. Files above that size are not indexed and are silently ignored by Elasticsearch, without any error.
How can I send the 1 GB of log data to Elasticsearch at once? I cannot break the log file into smaller pieces and send them to Elasticsearch, as the file is created automatically by a program.
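In case splitting on the client side turns out to be the only option, here is a minimal sketch of what sending the already generated bulk file in smaller _bulk requests might look like (the host, file name, and batch size are just placeholders, not my real setup):

import requests  # assumption: the requests library is available

BULK_FILE = "logdata.json"               # placeholder: path to the generated bulk file
ES_URL = "http://localhost:9200/_bulk"   # placeholder: Elasticsearch bulk endpoint
LINES_PER_BATCH = 10000                  # even number, so action/source pairs stay together

def send_batch(lines):
    # The bulk API expects newline-delimited JSON, terminated by a final newline.
    body = "".join(lines)
    if not body.endswith("\n"):
        body += "\n"
    resp = requests.post(ES_URL, data=body,
                         headers={"Content-Type": "application/x-ndjson"})
    resp.raise_for_status()

with open(BULK_FILE, "r") as f:
    batch = []
    for line in f:
        batch.append(line)
        if len(batch) >= LINES_PER_BATCH:
            send_batch(batch)
            batch = []
    if batch:
        send_batch(batch)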
Thank you for the quick reply. As I mentioned earlier, the log file is already written in the Bulk import format:
{ "index" : { "_index" : "test", "_type" : "type1" } }
{ "field1" : "value1" }
{ "index" : { "_index" : "test", "_type" : "type1" } }
{ "field1" : "value2" }
If I use Logstash, it would have to ignore every odd line, i.e. the { "index" : { "_index" : "test", "_type" : "type1" } } action lines; a rough sketch of that preprocessing is below.
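If stripping those action lines up front is acceptable, I imagine the preprocessing would look roughly like this (file names are only placeholders):

with open("logdata.json") as src, open("logstash_input.json", "w") as dst:
    for i, line in enumerate(src):
        # zero-based even indices are the bulk action lines; keep only the document lines
        if i % 2 == 1:
            dst.write(line)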