JDBC client is not working with Elasticsearch 6.5.1

I cannot sync data from MySQL to Elasticsearch.

I tried to follow https://www.elastic.co/blog/logstash-jdbc-input-plugin

I also asked on Stack Overflow.

Logstash config:

    # file: logstash-sample.conf
    input {
      jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
        jdbc_user => "root"
        jdbc_password => ""
        jdbc_validate_connection => true
        jdbc_driver_library => "C:\logstash-6.5.1\logstash-core\lib\jars\x-pack-sql-jdbc-6.5.1.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT * FROM persons"
      }
    }
    output {
      stdout { codec => json_lines }
      elasticsearch {
        index => "persons"
        document_type => "person"
        host => "localhost"
      }
    }

GET request:
http://localhost:9200/persons/_search/?pretty

            {
                "error": {
                    "root_cause": [
                        {
                            "type": "index_not_found_exception",
                            "reason": "no such index",
                            "resource.type": "index_or_alias",
                            "resource.id": "persons",
                            "index_uuid": "_na_",
                            "index": "persons"
                        }
                    ],
                    "type": "index_not_found_exception",
                    "reason": "no such index",
                    "resource.type": "index_or_alias",
                    "resource.id": "persons",
                    "index_uuid": "_na_",
                    "index": "persons"
                },
                "status": 404
            }
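
If you want to detect this case programmatically while debugging, here is a minimal Python sketch that parses the 404 body shown above (the `missing_index` helper is hypothetical, written for this illustration; the JSON shape is taken from the response in this thread):

```python
import json

# Sample 404 body, shaped like the GET /persons/_search response above
body = """
{
  "error": {
    "root_cause": [
      {
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "persons",
        "index_uuid": "_na_",
        "index": "persons"
      }
    ],
    "type": "index_not_found_exception",
    "reason": "no such index"
  },
  "status": 404
}
"""

def missing_index(response_text):
    """Return the missing index name if the response is an
    index_not_found_exception, otherwise None."""
    doc = json.loads(response_text)
    error = doc.get("error", {})
    if doc.get("status") == 404 and error.get("type") == "index_not_found_exception":
        return error["root_cause"][0].get("index")
    return None

print(missing_index(body))  # -> persons
```

A 404 here simply means Logstash never managed to write a single document, so the index was never created.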

What do the Logstash logs say?
Moving your question to #logstash

Since you are using Windows, you must change

C:\logstash-6.5.1\logstash-core\lib\jars\x-pack-sql-jdbc-6.5.1.jar

to

C:/logstash-6.5.1/logstash-core/lib/jars/x-pack-sql-jdbc-6.5.1.jar
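
Forward slashes work fine on Windows because the JVM accepts either separator, and they avoid any escaping ambiguity inside a double-quoted Logstash string. A quick Python sketch of the conversion (the `to_forward_slashes` helper is hypothetical, for illustration only):

```python
def to_forward_slashes(path):
    """Convert Windows-style backslashes to forward slashes
    for use in a Logstash config string."""
    return path.replace("\\", "/")

jar = r"C:\logstash-6.5.1\logstash-core\lib\jars\x-pack-sql-jdbc-6.5.1.jar"
print(to_forward_slashes(jar))
# C:/logstash-6.5.1/logstash-core/lib/jars/x-pack-sql-jdbc-6.5.1.jar
```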

Also refer to this example if you still have issues:

    input {
      jdbc {
        jdbc_driver_library => "D:/Softwares/logstash-6.4.2/lib/com.mysql.jdbc_5.1.5.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
        jdbc_user => "root"
        jdbc_password => "root"
        statement => "SELECT * FROM sample"
        jdbc_paging_enabled => "true"
        jdbc_page_size => "50000"
      }
    }

    output {
      elasticsearch { codec => json hosts => ["localhost:9200"] index => "mysqldata" }
      stdout { codec => rubydebug }
    }

@afeef1915

Your syntax seems to be wrong.

It is hosts, not host; I had made the same mistake:

    elasticsearch {
      index => "persons"
      document_id => "%{person_id}"
      hosts => ["http://192.168.0.197:9200"]  # use localhost or your own IP address
    }

Also look at the backslashes (\) in your driver path.

Updated Config

    input {
      jdbc {
        jdbc_driver_library => "C:/logstash-6.5.1/logstash-core/lib/jars/x-pack-sql-jdbc-6.5.1.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
        jdbc_user => "root"
        jdbc_password => ""
        statement => "SELECT * FROM persons"
        jdbc_paging_enabled => "true"
        jdbc_page_size => "50000"
      }
    }

    output {
      elasticsearch { codec => json hosts => ["localhost:9200"] index => "persons" }
      stdout { codec => rubydebug }
    }

Errors

    C:\logstash-6.5.1\bin>logstash -f ../config/logstash-sample.conf
    Sending Logstash logs to C:/logstash-6.5.1/logs which is now configured via log4j2.properties
    [2018-12-06T19:14:44,960][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2018-12-06T19:14:44,983][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.1"}
    [2018-12-06T19:14:49,094][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
    [2018-12-06T19:14:49,494][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2018-12-06T19:14:49,502][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2018-12-06T19:14:49,688][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2018-12-06T19:14:49,750][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
    [2018-12-06T19:14:49,755][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
    [2018-12-06T19:14:49,785][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2018-12-06T19:14:49,811][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
    [2018-12-06T19:14:49,848][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    [2018-12-06T19:14:50,417][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x417a3bbb run>"}
    [2018-12-06T19:14:50,492][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
    [2018-12-06T19:14:50,963][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
    [2018-12-06T19:14:51,961][INFO ][logstash.inputs.jdbc     ] (0.011913s) SELECT version()
    [2018-12-06T19:14:52,009][INFO ][logstash.inputs.jdbc     ] (0.000637s) SELECT version()
    [2018-12-06T19:14:52,157][INFO ][logstash.inputs.jdbc     ] (0.001019s) SELECT count(*) AS `count` FROM (SELECT * FROM persons) AS `t1` LIMIT 1
    [2018-12-06T19:14:52,202][INFO ][logstash.inputs.jdbc     ] (0.000885s) SELECT * FROM (SELECT * FROM persons) AS `t1` LIMIT 50000 OFFSET 0
    [2018-12-06T19:15:52,485][WARN ][logstash.outputs.elasticsearch] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out {:url=>http://localhost:9200/, :error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
    [2018-12-06T19:15:52,495][WARN ][logstash.outputs.elasticsearch] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out {:url=>http://localhost:9200/, :error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
    [2018-12-06T19:15:52,513][WARN ][logstash.outputs.elasticsearch] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out {:url=>http://localhost:9200/, :error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
    [2018-12-06T19:15:52,519][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch' but Elasticsearch appears to be unreachable or down! {:error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError", :will_retry_in_seconds=>2}
    [2018-12-06T19:15:52,519][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch' but Elasticsearch appears to be unreachable or down! {:error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError", :will_retry_in_seconds=>2}

Errors

    [2018-12-06T19:21:32,231][ERROR][logstash.outputs.elasticsearch] Attempted to send a bulk request to elasticsearch' but Elasticsearch appears to be unreachable or down! {:error_message=>"Elasticsearch Unreachable: [http://localhost:9200/][Manticore::SocketTimeout] Read timed out", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError", :will_retry_in_seconds=>64}
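
The growing `:will_retry_in_seconds` values in these logs (2, then 64) reflect the output's exponential backoff between failed bulk requests. A minimal Python sketch of such a doubling schedule (the `retry_delays` helper and its cap are illustrative assumptions, not Logstash's actual implementation):

```python
def retry_delays(initial=2, cap=64, attempts=6):
    """Yield doubling retry delays in seconds, capped at `cap`."""
    delay = initial
    for _ in range(attempts):
        yield delay
        delay = min(delay * 2, cap)

print(list(retry_delays()))  # [2, 4, 8, 16, 32, 64]
```

In other words, Logstash keeps retrying with longer and longer waits; the real fix is to make Elasticsearch reachable on port 9200, not to restart the pipeline.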

Thanks, the solution worked for me.

Cheers!! Please mark the answer as the solution.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.