Logstash - Using scheduler with JDBC input plugin results in RAM being locked for the scheduler and high CPU usage

Hi,

Using the JDBC input plugin with the schedule option results in:

a. RAM (around 150+ MB) being locked for the process.
b. High CPU usage.

Version details

  1. logstash-2.2.0

What does your configuration look like?

Here it goes:

input {

	jdbc {
		jdbc_driver_library => "<path to jar folder>/ojdbc6-11.2.0.3.0.jar"
		jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
		jdbc_connection_string => "jdbc:oracle:thin:@<oracle db ip>:<oracle db port>/<db service_name>"
		jdbc_user => "<user name>"
		jdbc_password => "<password>"
		schedule => "0 0/20 * * * ?"
		statement => "select id as id, name as person_name from person"
	}

}

filter {

	mutate {
		convert => { "id" => "integer" }
	}
	# mutate a few other columns as needed per the business case

}

output {
	# stdout { codec => json_lines }
	elasticsearch {
		hosts => ["<es host>:<es port>"]
		action => "index"
		index => "person_details"
		document_type => "person_details"
		document_id => "%{id}"
	}
}
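
For comparison, below is a stripped-down sketch of the same input that I plan to use to check whether the memory stays locked and the CPU stays high even without the elasticsearch output and the mutate filters. The driver path, connection string, and credentials are placeholders, and the trivial statement against dual is just to keep the result set small; it is not the actual business query.

input {

	jdbc {
		# same driver and connection details as above (placeholders here)
		jdbc_driver_library => "<path to jar folder>/ojdbc6-11.2.0.3.0.jar"
		jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
		jdbc_connection_string => "jdbc:oracle:thin:@<oracle db ip>:<oracle db port>/<db service_name>"
		jdbc_user => "<user name>"
		jdbc_password => "<password>"
		# same schedule expression as in the full config
		schedule => "0 0/20 * * * ?"
		# trivial statement so the query itself costs almost nothing
		statement => "select 1 as id from dual"
	}

}

output {
	# console output only, to isolate the scheduler overhead from elasticsearch
	stdout { codec => json_lines }
}

If this minimal config shows the same behaviour, that would suggest the RAM and CPU usage comes from the scheduler in the input plugin rather than from the filters or the elasticsearch output.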