Data not showing when linking MySQL with Elasticsearch using Logstash

Hi! I am trying to link MySQL with Elasticsearch using Logstash. The database connects, but the schemas and the data are not showing up.
My .conf file:
```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://mylocalhost/persondetails"
    jdbc_user => "root"
    jdbc_password => "mypassword"
    schedule => "* * * * *"
    jdbc_validate_connection => true
    jdbc_driver_library => "/usr/local/Cellar/logstash/5.5.2/mysql-connector-java-3.1.14/mysql-connector-java-3.1.14-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM Document"
  }
}
output {
  elasticsearch {
    #protocol=>http
    index => "persondetails"
    document_type => "Document"
    document_id => "%{idDocument}"
    hosts => ["http://localhost:9200"]
    codec => json_lines
  }
}
```
Is there any problem with the configuration file?
The output is as follows:
Sending Logstash's logs to /usr/local/Cellar/logstash/5.5.2/libexec/logs which is now configured via log4j2.properties
[2017-08-28T14:29:43,857][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-08-28T14:29:43,873][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-08-28T14:29:44,075][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-08-28T14:29:44,077][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-28T14:29:44,184][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-28T14:29:44,195][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2017-08-28T14:29:44,201][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-28T14:29:44,417][INFO ][logstash.pipeline ] Pipeline main started
[2017-08-28T14:29:44,506][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-08-28T14:30:03,933][INFO ][logstash.inputs.jdbc ] (2.902000s) SELECT * FROM Document
[2017-08-28T14:32:27,925][INFO ][logstash.inputs.jdbc ] (2.801000s) SELECT * FROM Document
[2017-08-28T14:34:43,540][INFO ][logstash.inputs.jdbc ] (2.166000s) SELECT * FROM Document
[2017-08-28T14:36:58,760][INFO ][logstash.inputs.jdbc ] (2.377000s) SELECT * FROM Document

codec => json_lines

Remove this. The `json_lines` codec is meant for stream-oriented outputs; the elasticsearch output should use its default codec, otherwise your events may not be indexed as expected.
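For reference, the output section with only that codec line dropped (same index, type, document_id, and hosts as in the posted config) would look like this:

```
output {
  elasticsearch {
    index => "persondetails"
    document_type => "Document"
    document_id => "%{idDocument}"
    hosts => ["http://localhost:9200"]
  }
}
```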

Is the jdbc input producing anything? Use a `stdout { codec => rubydebug }` output to debug the situation.
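For example, a temporary output section like this (a minimal debugging sketch) prints every event the jdbc input emits to the console:

```
output {
  stdout { codec => rubydebug }
}
```

If nothing is printed, the problem is on the input side; if events show up here but not in Elasticsearch, the problem is in the elasticsearch output.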

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.