Trying to push a 17GB folder of log files into ES

I am trying to push a 17GB folder of assorted log files into Elasticsearch running on an 8GB CentOS machine.

What is the best way to configure ES for this?
I did the following:

- Set ES_HEAP_SIZE=2g for /usr/share/elasticsearch/bin/elasticsearch
- Added index.codec: best_compression to elasticsearch.yml
- In /etc/sysconfig/elasticsearch: MAX_OPEN_FILES=65535 and MAX_LOCKED_MEMORY=unlimited
- In /etc/security/limits.conf: elasticsearch - nofile 65535 and elasticsearch - memlock unlimited
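
For reference, this is roughly how those settings look in each file (a sketch of my setup; paths follow the standard RPM layout):

    # /etc/sysconfig/elasticsearch
    ES_HEAP_SIZE=2g              # JVM heap; note "2gb" is not a valid JVM size suffix
    MAX_OPEN_FILES=65535
    MAX_LOCKED_MEMORY=unlimited

    # elasticsearch.yml
    index.codec: best_compression

    # /etc/security/limits.conf
    elasticsearch - nofile 65535
    elasticsearch - memlock unlimited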

I am trying to push each entire log file into ES as a single document; that way, all 17GB of data would end up under one index.
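
For context, here is a sketch of the Logstash pipeline I have in mind. The path and index name are placeholders, and the multiline codec settings are an assumption about how a whole file might be glued into one event (a pattern that never matches, so every line is appended to the previous one):

    input {
      file {
        path => "/data/logs/**/*.log"          # placeholder path
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => multiline {
          pattern => "^ThisPatternNeverMatches$"
          negate => true
          what => "previous"
          max_lines => 100000                   # default is only 500 lines
          auto_flush_interval => 5              # flush the pending event when input goes idle
        }
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "alllogs"                      # single target index
      }
    }

Note that the multiline codec still cuts events at its max_bytes limit (10MB by default), so one document per file only holds for files below that size.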
Observation:
After some time the ES cluster status goes red and data stops flowing from Logstash (LS) into ES.
Kindly help me understand memory allocation in ES.
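
For what it's worth, these are the standard calls I use to inspect the cluster when it goes red:

    curl 'localhost:9200/_cluster/health?pretty'
    curl 'localhost:9200/_cat/indices?v'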

Hello Devi,

I am trying to understand what you are trying to do. Are you trying to index a single 17GB log file?

Cheers

No, it is not a single log file. The folder contains multiple subdirectories, and they store different log files.

Ok Devi, could you please elaborate on what you mean by "push each entire log file into ES as a single document"?

To understand what's going on with your Elasticsearch node, please send some logs from Elasticsearch from when the cluster goes red.
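
On a CentOS RPM install the logs usually live under /var/log/elasticsearch (the file name matches your cluster name; elasticsearch.log by default), and the shard listing often shows why the cluster went red:

    # default log location for the RPM package
    tail -n 200 /var/log/elasticsearch/elasticsearch.log

    # red status means unassigned primary shards; this lists shard states
    curl 'localhost:9200/_cat/shards?v'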

Regards