Trying to extract data from MySQL

I am following this tutorial: http://r2d2.cc/2015/08/05/elastic-search-on-windows-with-mysql-sql-server-2014-and-import-feeder/. I want to extract data from MySQL and index it into Elasticsearch. I am using:

  • ElasticSearch version: 6.2.1
  • elasticsearch-jdbc-2.3.4.1-dist

My jdbc_mysql.json:
{
  "type": "jdbc",
  "jdbc": {
    "url": "jdbc:mysql://localhost:3306/test",
    "user": "root",
    "sql": "SELECT * FROM test1",
    "index": "users",
    "type": "user",
    "autocommit": "true",
    "autodiscover": "true"
  }
}
My jdbc_mysql.ps1:
function Get-PSVersion {
    if (Test-Path variable:psversiontable) { $psversiontable.psversion } else { [version]"1.0.0.0" }
}
$powershell = Get-PSVersion
if ($powershell.Major -le 2) {
    Write-Error "Oh, so sorry, this script requires Powershell 3 (due to convertto-json)"
    exit
}
if ((Test-Path env:\JAVA_HOME) -eq $false) {
    Write-Error "Environment variable JAVA_HOME must be set to your java home"
    exit
}
#curl -XDELETE "http://localhost:9200/users"

$DIR = "C:\Users\elasticsearch-6.1.2\plugins\elasticsearch-jdbc-2.3.4.1\"
$FEEDER_CLASSPATH="$DIR\lib"
$FEEDER_LOGGER="file:///$DIR\bin\log4j2.xml"
 
java -cp "$FEEDER_CLASSPATH\*" "-Dlog4j.configurationFile=$FEEDER_LOGGER" "org.xbib.tools.Runner" "org.xbib.tools.JDBCImporter" jdbc_mysql.json

and I am getting this error:

[11:43:58,752][INFO ][org.elasticsearch.org.xbib.elasticsearch.helper.client.TransportClient][pool-2-thread-1] [importer] failed to get node info for {#transport#-1}{127.0.0.1}{localhost/127.0.0.1:9300}, disconnecting...
org.elasticsearch.transport.NodeDisconnectedException: [][localhost/127.0.0.1:9300][cluster:monitor/nodes/liveness] disconnected
[11:43:58,752][ERROR][importer.jdbc ][pool-2-thread-1] error while processing request: no cluster nodes available, check settings {autodiscover=false, client.transport.ignore_cluster_name=false, client.transport.nodes_sampler_interval=5s, client.transport.ping_timeout=5s, cluster.name=elasticsearch, flush_interval=5s, max_actions_per_request=10000, max_concurrent_requests=8, max_volume_per_request=10mb, name=importer, port=9300, sniff=false}
org.elasticsearch.client.transport.NoNodeAvailableException: no cluster nodes available, check settings {autodiscover=false, client.transport.ignore_cluster_name=false, client.transport.nodes_sampler_interval=5s, client.transport.ping_timeout=5s, cluster.name=elasticsearch, flush_interval=5s, max_actions_per_request=10000, max_concurrent_requests=8, max_volume_per_request=10mb, name=importer, port=9300, sniff=false}
at org.xbib.elasticsearch.helper.client.BulkTransportClient.init(BulkTransportClient.java:164) ~[elasticsearch-helper-2.3.4.0.jar:?]
at org.xbib.elasticsearch.helper.client.ClientBuilder.toBulkTransportClient(ClientBuilder.java:113) ~[elasticsearch-helper-2.3.4.0.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.createClient(StandardSink.java:348) ~[elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardSink.beforeFetch(StandardSink.java:100) ~[elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.beforeFetch(StandardContext.java:183) ~[elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.elasticsearch.jdbc.strategy.standard.StandardContext.execute(StandardContext.java:164) ~[elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.tools.JDBCImporter.process(JDBCImporter.java:203) ~[elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.tools.JDBCImporter.newRequest(JDBCImporter.java:189) [elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.tools.JDBCImporter.newRequest(JDBCImporter.java:53) [elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:50) [elasticsearch-jdbc-2.3.4.1.jar:?]
at org.xbib.pipeline.AbstractPipeline.call(AbstractPipeline.java:16) [elasticsearch-jdbc-2.3.4.1.jar:?]
at java.util.concurrent.FutureTask.run(Unknown Source) [?:1.8.0_161]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_161]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_161]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_161]

Are you asking for assistance or just pasting what looks like an error?

Assistance, but I am still editing it ...

It looks like this very old tutorial is using the JDBC importer for Elasticsearch, which does not seem to support any Elasticsearch version beyond 2.3.4. Using Elasticsearch 6.x will therefore not work.

If you are looking to import data through JDBC, you may instead want to try Logstash and the JDBC input plugin, as this will work with the latest version of Elasticsearch.

Have a look at this blog post about it.
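As a rough sketch, a minimal Logstash pipeline using the JDBC input could look like the following. The driver path, database name, credentials, index name, and SQL statement are placeholders to adapt to your setup:

input {
  jdbc {
    # Path to the MySQL JDBC driver jar (placeholder)
    jdbc_driver_library => "C:\drivers\mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
    jdbc_user => "root"
    jdbc_password => "secret"
    statement => "SELECT * FROM test1"
  }
}
output {
  # Send each row as a document to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "users"
  }
}

Save this as a .conf file and run it with bin\logstash -f yourfile.conf.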


Thanks for your quick response, I will check it...

Yeah, I went through this blog post, but when I ran the first command "bin\logstash-plugin install logstash-input-jdbc" it gave the error below:

Validating logstash-input-jdbc
Installing logstash-input-jdbc
ERROR: Installation Aborted, message: undefined method `[]=' for #&lt;Bundler::Settings:0x621f23ac&gt;
Did you mean? []

I am using Logstash 6.1.2 on Windows 10.

The blog post is a bit old; I believe the JDBC input plugin is now bundled with Logstash by default, meaning you can just use it without installing anything.

aha, I will try then ...

As the post is a bit old, the documentation I linked to will be a better reference for how to configure the plugin, as this may have changed.

After running "bin\logstash -f simple_outpout.conf"

I got this error:

NoMethodError: undefined method `[]=' for #&lt;Bundler::Settings:0x32502377&gt;
Did you mean? []
setup! at C:/Users/logstash-6.1.2/lib/bootstrap/bundler.rb:50

at C:\Users\logstash-6.1.2\lib\bootstrap\environment.rb:63

My conf file:

input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "root"
    schedule => "* * * * *"
    statement => "SELECT * from test1"
  }
}
output {
    stdout { codec => json_lines }
}
