Hi everybody
I have an issue dealing with a large table in PostgreSQL. The table has about 1 million rows, and each row contains some text, roughly half an A4 page in length. I want to index this table into Elasticsearch, but I always get java.lang.OutOfMemoryError: Java heap space. I increased the JVM heap size to 4 GB and I can't increase it any further. I also added the jdbc_page_size option to my Logstash config file, but it doesn't help.
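For reference, this is how I set the heap, assuming config/jvm.options is the right place for my Logstash install (the jdbc_page_size change is in the pipeline config below):

    # config/jvm.options for Logstash: heap fixed at 4 GB
    -Xms4g
    -Xmx4g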
input {
  jdbc {
    # Postgres JDBC connection string to our database, jmdb
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/jmdb"
    # The user we wish to execute our statement as
    jdbc_user => "xxx"
    jdbc_password => "xxx"
    # The path to our downloaded JDBC driver
    jdbc_driver_library => "${HOME}/postgresql-42.2.8.jar"
    # The name of the driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_paging_enabled => true
    jdbc_page_size => 10000
    statement_filepath => "${INDEXING_DIRECTORY}/decision_index.sql"
    type => "decision"
  }
}

output {
  elasticsearch {
    index => "decision"
  }
}
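I also wondered whether having the driver stream rows with a cursor would help instead of paging. This is only a sketch I'm considering, assuming the plugin's jdbc_fetch_size option maps to the JDBC statement fetch size for PostgreSQL:

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/jmdb"
    jdbc_user => "xxx"
    jdbc_password => "xxx"
    jdbc_driver_library => "${HOME}/postgresql-42.2.8.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Assumption: jdbc_fetch_size sets the statement fetch size so the
    # driver streams rows in batches instead of buffering the whole
    # result set in memory
    jdbc_fetch_size => 1000
    statement_filepath => "${INDEXING_DIRECTORY}/decision_index.sql"
    type => "decision"
  }
}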
Does anyone know how to solve this? Or is there a way to monitor memory usage per page, so I can tell how large jdbc_page_size can be without exhausting the Java heap?
Thanks a lot.