Error with Elasticsearch River SQL Server integration

I am getting the error below:

SQLException: Connection is read-only. Queries leading to data modification are not allowed.
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:353)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.fetch(SimpleRiverFlow.java:226)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverFlow.execute(SimpleRiverFlow.java:152)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.request(RiverPipeline.java:88)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:66)
at org.xbib.elasticsearch.plugin.jdbc.RiverPipeline.call(RiverPipeline.java:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Connection is read-only. Queries leading to data modification are not allowed.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1056)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:927)
at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1516)
at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1485)
at com.mysql.jdbc.RowDataDynamic.close(RowDataDynamic.java:191)
at com.mysql.jdbc.ResultSetImpl.realClose(ResultSetImpl.java:7466)
at com.mysql.jdbc.ResultSetImpl.close(ResultSetImpl.java:881)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.close(SimpleRiverSource.java:842)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.execute(SimpleRiverSource.java:422)
at org.xbib.elasticsearch.river.jdbc.strategy.simple.SimpleRiverSource.fetch(SimpleRiverSource.java:332)

River Plugin used

plugin=org.xbib.elasticsearch.plugin.jdbc.river.JDBCRiverPlugin
version=1.5.0.5
I am running a SELECT query to fetch data from SQL Server.

I am using the SQL Server connector, sqljdbc4.jar.

Please help me with this error.

You should look at using the jdbc input plugin for Logstash; rivers are deprecated.

Hello Mark,
Thanks for the reply. Yes, rivers are deprecated, but only after 1.5. The issue is that the river works fine for most of my SELECT queries, but one of them is causing this error. I can share the query if you want.

Thanks
Senz

JDBC importer is not deprecated. It's a community contribution.

For Elasticsearch 1.5 I still support the full river API of JDBC importer/river. It's the same codebase.

The JDBC importer/river uses read-only connections by default, for security and performance. The query you are using is not only reading but also writing. In that case, you must switch the SQL statement configuration to a callable statement. More information on callable statements is available at https://github.com/jprante/elasticsearch-jdbc#stored-procedures-or-callable-statements
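For reference, a definition with a callable statement looks roughly like this. This is only a sketch: `myproc` is a hypothetical stored procedure name, and the exact configuration keys should be checked against the README linked above:

```json
{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:sqlserver://xxx.xxx.xxx.xxx:1433;databaseName=EPS",
        "user" : "myuser",
        "password" : "mypassword",
        "sql" : [
            {
                "callable" : true,
                "statement" : "{call myproc}",
                "parameter" : []
            }
        ]
    }
}
```

The `"callable" : true` flag is what tells the importer to execute the statement through a JDBC CallableStatement instead of a plain read-only query.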

You can also open issues at https://github.com/jprante/elasticsearch-jdbc/issues/ if you need help for JDBC importer/river.

Mark, I tried the jdbc input plugin and followed the link below.

Now everything seems correct, as I am getting the message "Logstash startup completed".

But nothing seems to happen ... no index is created in Elasticsearch. I am pasting the conf file here; please advise.

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://xxx.xxx.xxx.xxx:1433;databaseName=EPS"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    jdbc_validate_connection => true
    jdbc_driver_library => "/opt/logstash-1.5.4/bin/sqljdbc4-3.0.jar"
    # the class name is case-sensitive
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    schedule => "0 0/2 * * * *"
    statement => "select count(1) as count from [Tutoring].[tblAppointmentExtendedProperties]"
  }
}
output {
  elasticsearch {
    protocol => "http"
    index => "epsusage"
    document_type => "epsusage"
    host => "xxx.xxx.xxx.xxx"
    port => "9200"
  }
  stdout { codec => rubydebug }
}
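Not something from this thread, just a common way to narrow this down: temporarily reduce the output section to stdout only, so you can see whether the jdbc input emits any events at all when the schedule fires (the input section stays the same):

```
output {
  # if events print here, the jdbc input works and the problem
  # is on the elasticsearch output side (protocol, host, port)
  stdout { codec => rubydebug }
}
```

Running `bin/logstash -f yourfile.conf --configtest` first will also catch structural problems such as a missing `input { }` wrapper or an unbalanced brace. Note too that `jdbc_driver_class` is case-sensitive; for the Microsoft driver it is `com.microsoft.sqlserver.jdbc.SQLServerDriver`.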

Hey Jorg,
I have raised the issue.

Please help.