Logstash OutOfMemoryError with the JDBC input

Hi everyone,

After I run Logstash, the console shows the following error:

""Ruby-0-Thread-6: /usr/pds/logstash-2.2.2/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.5.1-java/lib/logstash/outputs/elasticsearch/buffer.rb:78" java.lang.OutOfMemoryError: Java heap space
Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>java.lang.OutOfMemoryError: Java heap space, "backtrace"=>[], :level=>:error}
Error: Your application used more memory than the safety cap of 1G.
Specify -J-Xmx####m to increase it (#### = cap size in MB).
Specify -w for full OutOfMemoryError stack trace"
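
If I read the message correctly, I could raise the cap. On Logstash 2.x I believe the heap is controlled by the LS_HEAP_SIZE environment variable (my understanding from the docs, not something I have tested yet), for example:

LS_HEAP_SIZE=4g bin/logstash -f /opt/pds/jdbc.conf

(The config path and heap size above are just examples.) But even with a bigger heap, I suspect the real problem is how many rows get loaded at once.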

I am using the following Logstash jdbc input configuration:
input {
  jdbc {
    jdbc_driver_library => "/opt/pds/lib/postgresql-9.3-1101-jdbc41.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://10.2.5.116:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_fetch_size => 10000
    statement => "SELECT * from postgres_log"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    document_id => "%{session_line_num}"
    hosts => ["127.0.0.1"]
    index => "postgres_log"
  }
}
When I add LIMIT 1000000 to the statement, it runs fine (the variant I used is shown below). By the way, the table contains 3,561,859 rows.
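
For reference, this is the only change that makes it run, though it is a workaround rather than a fix, since the table has more rows than the limit:

statement => "SELECT * from postgres_log LIMIT 1000000"

I also see that the jdbc input plugin documents jdbc_paging_enabled and jdbc_page_size options. I assume a paged variant like the sketch below would let Logstash fetch the table in chunks instead of buffering the whole result set, but I have not verified this:

input {
  jdbc {
    jdbc_driver_library => "/opt/pds/lib/postgresql-9.3-1101-jdbc41.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://10.2.5.116:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_paging_enabled => true
    jdbc_page_size => 50000   # example page size, not taken from my current config
    statement => "SELECT * from postgres_log"
  }
}

Is paging the right approach here, or should I be tuning jdbc_fetch_size instead?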

How can I solve this problem? Thanks!

Best Regards,
Levi