While importing a 9 GB table into Elasticsearch through Logstash, I get the following error:
[2018-01-17T16:34:59,948][INFO ][logstash.inputs.jdbc ] (120.731000s) SELECT * from mis_monthly_work_statistics
[2018-01-17T16:34:59,963][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaLang::OutOfMemoryError: Java heap space>}
Error: Your application used more memory than the safety cap of 1G.
Specify -J-Xmx####M to increase it (#### = cap size in MB).
Specify -w for full java.lang.OutOfMemoryError: Java heap space stack trace
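For reference, the JDBC input is configured along these lines. The connection string, credentials, and driver details below are placeholders, not my real values; the statement is the one shown in the log above:

input {
  jdbc {
    # Placeholder connection details
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Full-table statement from the log above; jdbc_paging_enabled and
    # jdbc_fetch_size are left at their defaults, so the entire result
    # set is pulled in one query
    statement => "SELECT * from mis_monthly_work_statistics"
  }
}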
I have tried specifying -Xms and -Xmx in jvm.options, but the following error is received:
Invalid initial heap size: -Xms12g
The specified size exceeds the maximum representable size.
JVM creation failed
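For reference, this is how the heap was set (assuming the standard config/jvm.options location):

-Xms12g
-Xmx12g

As far as I know, a 12 GB heap should only be rejected as unrepresentable by a 32-bit JVM, so I do not understand why this fails.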
I have also tried declaring LS_HEAP_SIZE=" -Xmx12g -Xms12g" in startup-options, but it makes no difference.
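That declaration looked like this (file path assumed to be the standard /etc/logstash/startup.options):

LS_HEAP_SIZE=" -Xmx12g -Xms12g"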