JDBC River for MySQL

Hi,
My operating system: Linux (CentOS)
My platform: Elasticsearch 2.1.0 with the Marvel plugin
My JDBC River plugin: elasticsearch-jdbc-2.1.0.0
My MySQL database: admin_ihale
My table: tbl_ihale

I tried. My code:

```
PUT /orders/order/1
{}

PUT /_river/my_jdbc_river/_meta
{
  "type": "jdbc",
  "jdbc": {
    "strategy": "simple",
    "digesting": true,
    "fetchsize": "50",
    "driver": "com.mysql.jdbc.Driver",
    "url": "jdbc:mysql://localhost:3306/admin_ihale",
    "user": "admin_ihale",
    "password": "admin_ihale",
    "sql": [
      {
        "statement": "select * from tbl_ihale where id = ?",
        "parameter": ["1001", "US"]
      }
    ],
    "index": "orders",
    "type": "order",
    "schedule": "0 */5 0-23 ? * *"
  },
  "index": {
    "index": "orders",
    "type": "order",
    "versioning": true
  }
}
```

My error:
```
{
  "error": {
    "root_cause": [
      {
        "type": "invalid_index_name_exception",
        "reason": "Invalid index name [_river], must not start with '_'",
        "index": "_river"
      }
    ],
    "type": "invalid_index_name_exception",
    "reason": "Invalid index name [_river], must not start with '_'",
    "index": "_river"
  },
  "status": 400
}
```

I want to move my data to Elasticsearch. How do I import data from MySQL into Elasticsearch 2.1.0? Can you help me? I already have the JDBC river plugin. Do I also have to download the JDBC driver?

Read the documentation: https://github.com/jprante/elasticsearch-jdbc

It is not a river anymore but a standalone application, so you don't create anything under `_river` (rivers were removed in Elasticsearch 2.x, which is why you get the `invalid_index_name_exception`).

Also, you could look at the Logstash JDBC input.
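For reference, a minimal Logstash pipeline for the same table might look like the sketch below. The index, credentials, and connection string mirror the ones in your river config; the connector jar path and the `localhost:9200` host are assumptions you would adjust:

```
input {
  jdbc {
    # Path to the MySQL Connector/J jar is an assumption - point it at your copy
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/admin_ihale"
    jdbc_user => "admin_ihale"
    jdbc_password => "admin_ihale"
    statement => "SELECT * FROM tbl_ihale"
    # Run every 5 minutes, roughly matching the river schedule above
    schedule => "*/5 * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders"
    document_type => "order"
    # Reuse the table's id column as the document id to make runs idempotent
    document_id => "%{id}"
  }
}
```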

Thank you. I tried, but I get an error.

My code:

```
#!/bin/sh
bin=$JDBC_IMPORTER_HOME/bin
lib=$JDBC_IMPORTER_HOME/lib
echo '{
  "type" : "jdbc",
  "jdbc" : {
    "url" : "jdbc:mysql://localhost:3306/admin_ihale",
    "user" : "admin_ihale",
    "password" : "admin_ihale",
    "sql" : "select *, id as _id from tbl_ihale"
  }
}' | java \
  -cp "${lib}/*" \
  -Dlog4j.configurationFile=${bin}/log4j2.xml \
  org.xbib.tools.Runner \
  org.xbib.tools.JDBCImporter
```
My error:

```
Could not find or load main class org.xbib.tools.Runner
```
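That error usually means the classpath never matched the importer's jars: either `JDBC_IMPORTER_HOME` is unset (so `${lib}` expands to nothing) or the `*` wildcard was lost from `-cp "${lib}/*"`. A quick sanity-check sketch, assuming the distribution was unpacked at `$HOME/elasticsearch-jdbc-2.1.0.0` (that path is an assumption, adjust it to yours):

```shell
#!/bin/sh
# Fall back to a guessed install location if JDBC_IMPORTER_HOME is unset;
# an empty variable would make the classpath "/lib/*", which contains no jars,
# and java then reports "Could not find or load main class org.xbib.tools.Runner".
JDBC_IMPORTER_HOME="${JDBC_IMPORTER_HOME:-$HOME/elasticsearch-jdbc-2.1.0.0}"
bin="$JDBC_IMPORTER_HOME/bin"
lib="$JDBC_IMPORTER_HOME/lib"
# Keep the * inside the quotes: java expands the wildcard itself and picks up
# every jar in lib/. If the shell or a copy-paste drops the *, the Runner
# class is not on the classpath at all.
classpath="${lib}/*"
echo "classpath: $classpath"
```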