Issue with JDBC for MariaDB


(raba) #1

Hi,
I honestly tried to browse for similar issues related to this topic but couldn't find any, so please excuse me if my searching was off. Most likely I'll need help troubleshooting why my configuration for extracting data from MariaDB is not working so far:

Logstash conf file:

jdbc {
  jdbc_driver_library => "/etc/logstash/jdbc-drivers/mariadb-java-client-2.2.1-sources.jar"
  jdbc_driver_class => "org.mariadb.jdbc.Driver"
  jdbc_connection_string => "jdbc:mariadb://overall:3306/tasks"
  jdbc_user => "..."
  jdbc_password => "..."
  jdbc_validate_connection => true
  jdbc_page_size => 50000
  jdbc_paging_enabled => true
  schedule => "* * * * *"
  statement => "SELECT * from taskoverview"
  sql_log_level => "debug"
  type => "task-data"
}

output {
  if [type] =~ "task-*" {
    elasticsearch {
      hosts => "elastic:9200"
      manage_template => true
      index => "task-data-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
      document_id => "%{taskid}"
    }
  }
}

In the logs:

Error: org.mariadb.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?

What is wrong with the configuration? What does 'included' mean here?
The logstash-input-jdbc plugin is already installed.
java version "1.8.0_151"
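For reference, the message means the class org.mariadb.jdbc.Driver could not be loaded from the jar given in jdbc_driver_library. A -sources.jar contains only Java source files, no compiled classes, so the driver class cannot be found in it. A minimal sketch of that one line pointing at the binary connector jar instead (the /etc/logstash/jdbc-drivers path is taken from the config above):

  # the binary connector jar, not the -sources jar
  jdbc_driver_library => "/etc/logstash/jdbc-drivers/mariadb-java-client-2.2.1.jar"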


(raba) #2

Hm, after reading the MariaDB Connector/J doc https://mariadb.com/kb/en/library/about-mariadb-connector-j/
I moved a bit further (jdbc_driver_library now points at the binary mariadb-java-client-2.2.1.jar instead of the -sources jar, as shown in the plugin dump below), but still without success.
Now I'm getting 'permission denied'. For what?

Plugin: <LogStash::Inputs::Jdbc jdbc_driver_library=>"/etc/logstash/jdbc-drivers/mariadb-java-client-2.2.1.jar", jdbc_driver_class=>"org.mariadb.jdbc.Driver", jdbc_connection_string=>"jdbc:mariadb://overall:3306/tasks", jdbc_user=>"...", jdbc_password=>, sql_log_level=>"debug", statement=>"SELECT * from taskoverview", type=>"task-data", id=>"c8de2c09609c6c678d21dfab572c248a97e72669-9", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_11b4e6c1-530a-4969-8040-0adc0b6532e6", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>2018-01-19 16:01:16 UTC}, last_run_metadata_path=>"/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: Permission denied - /.logstash_jdbc_last_run

Permissions on the DB are already set.
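The 'permission denied' here is not about the database: the jdbc input persists its last-run state to $HOME/.logstash_jdbc_last_run by default, which in this setup resolves to /.logstash_jdbc_last_run and is not writable by the Logstash user. A minimal sketch of one way around it (the /var/lib/logstash path is an assumption; any directory the Logstash user can write to works):

  # inside the jdbc { ... } input block:
  # write the last-run bookkeeping file somewhere the Logstash user can write
  last_run_metadata_path => "/var/lib/logstash/.logstash_jdbc_last_run"
  # or skip persisting the last run entirely:
  # record_last_run => false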


(raba) #3

record_last_run => "true"
was the reason for the 'permission denied', so it was not related to the DB permissions at all. Now the log looks OK:

[2018-01-19T16:43:00,590][INFO ][logstash.inputs.jdbc ] (0.075000s) SELECT * FROM taskoverview

I'm completely blind on this... all the issues from the start of this topic are already cleared, but the data is still not being ingested from MariaDB -> Elasticsearch.
If you want and can, please help.


(Magnus Bäck) #4

Temporarily replace your elasticsearch output with a stdout { codec => rubydebug } output and run Logstash again. What happens?
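A minimal sketch of that debug pipeline, assuming the jdbc input stays as above:

  output {
    # print every event to the console to check whether the jdbc input emits anything at all
    stdout { codec => rubydebug }
  }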


(raba) #5

The issue was coming from a table field called 'type'; I suppose it conflicts with the '_type' field that Elasticsearch already reserves. We omit that column for now so we can move further, and the data is now collected from MariaDB, but another issue is in place:
when we grab the data via the JDBC connector from the database daily, the next day we have duplicate entries from the previous grab. Is there any way to prevent that, or to make it work incrementally (only new items being added)?
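If the column itself is still wanted in Elasticsearch, one way to sidestep the clash is to alias it in the query instead of omitting it. A sketch, where task_type is a made-up target name and only taskid and type are column names taken from this thread (any other columns of taskoverview would be listed alongside them):

  # inside the jdbc { ... } input: rename the conflicting column at query time
  statement => "SELECT taskid, type AS task_type FROM taskoverview"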


(Magnus Bäck) #6

Read the jdbc input docs and pay special attention to the sql_last_value parameter.
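For reference, a minimal sketch of an incremental setup, assuming taskoverview has a monotonically increasing numeric taskid column to track (that column choice is an assumption; any strictly increasing column or timestamp works):

  input {
    jdbc {
      jdbc_driver_library => "/etc/logstash/jdbc-drivers/mariadb-java-client-2.2.1.jar"
      jdbc_driver_class => "org.mariadb.jdbc.Driver"
      jdbc_connection_string => "jdbc:mariadb://overall:3306/tasks"
      jdbc_user => "..."
      jdbc_password => "..."
      schedule => "* * * * *"
      # remember the highest taskid seen so far and only fetch newer rows on the next run
      use_column_value => true
      tracking_column => "taskid"
      tracking_column_type => "numeric"
      statement => "SELECT * FROM taskoverview WHERE taskid > :sql_last_value ORDER BY taskid"
      type => "task-data"
    }
  }

If the elasticsearch output keeps document_id => "%{taskid}" and taskid is unique per row, re-read rows update the same documents instead of creating new ones; tracking a column mainly avoids re-reading the whole table on every run.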


(raba) #7

Yeah, reading it twice fixed the issue ))
You are awesome again :wink:
Thanks a lot!


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.