Hi! I'm trying to upload a large amount of data into Elasticsearch, but I always get a Java exception unless I put a LIMIT in the SQL query.
How can I upload big data with the jdbc input plugin? (I increased the allowed memory size, but it did not help.)
What Java exception? What does your Logstash configuration look like?
input {
  jdbc {
    jdbc_driver_library => "path"
    jdbc_driver_class => "Driver"
    jdbc_connection_string => "path"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => '
      select
        *
      from
        tables
      where "Id" > :sql_last_value
    '
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"
    last_run_metadata_path => "path"
    schedule => "* /1 * * *"
    id => "id"
  }
}
filter {
  date {
    match => ["dt", "yyyy-MM-dd HH:mm:ss.SSS"]
    timezone => "Etc/UTC"
    remove_field => ["dt"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "index"
    document_id => "%{id}"
  }
}
Java exception: call site initialization exception
Please post the full error message.
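In the meantime, it may be worth letting the jdbc input page through the result set instead of pulling everything in one query. A minimal sketch of the relevant input settings, assuming the rest of the pipeline stays as posted (the page and fetch sizes are illustrative, not tuned values):

input {
  jdbc {
    # ... same driver, connection, statement and tracking settings as above ...
    # Break the query into pages so the whole result set is never held in memory at once.
    jdbc_paging_enabled => true
    # Rows per page; illustrative value, tune to your row size and heap.
    jdbc_page_size => 50000
    # Hint to the JDBC driver for how many rows to fetch per round trip.
    jdbc_fetch_size => 1000
  }
}

With jdbc_paging_enabled, the plugin wraps the statement and pulls it in chunks of jdbc_page_size rows, so the full table never has to fit into the Logstash heap at once.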