Hi,
I am trying to import data from a PostgreSQL database, but the import never completes because Logstash runs out of memory. If I limit the rows in my SELECT statement, the data imports successfully.
Current JVM args: -Xmx4000m -Xms256m -Xss2048k
The data is 5 GB (7,475,233 records).
Logstash Config
INPUT:
input {
  beats {
    host => "0.0.0.0"
    port => 10514
    tags => "syslog_index"
  }
  beats {
    host => "0.0.0.0"
    port => 10516
    tags => "metricbeat"
  }
  jdbc {
    # PostgreSQL JDBC connection string for our database
    jdbc_connection_string => "jdbc:postgresql://server:5432/Database"
    # The user we execute our statement as
    jdbc_user => "user"
    jdbc_password => "password"
    # Path to the downloaded JDBC driver
    jdbc_driver_library => "/opt/logstash/postgresql-9.4-1204.jdbc41.jar"
    # The driver class for PostgreSQL
    jdbc_driver_class => "org.postgresql.Driver"
    last_run_metadata_path => "/opt/logstash/logstash_jdbc_last_run"
    # Our query
    statement => "SELECT * FROM table"
    tags => "test_index"
  }
}
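For reference, one thing I have been looking at: the jdbc input apparently pulls the whole result set into memory unless paging is enabled. A sketch of the same jdbc block with the plugin's paging options turned on (the page and fetch sizes below are illustrative values, not tuned for this data set):

```
jdbc {
  jdbc_connection_string => "jdbc:postgresql://server:5432/Database"
  jdbc_user => "user"
  jdbc_password => "password"
  jdbc_driver_library => "/opt/logstash/postgresql-9.4-1204.jdbc41.jar"
  jdbc_driver_class => "org.postgresql.Driver"
  last_run_metadata_path => "/opt/logstash/logstash_jdbc_last_run"
  # Run the statement in pages (LIMIT/OFFSET) instead of one huge query
  jdbc_paging_enabled => true
  jdbc_page_size => 50000
  # Ask the driver to stream rows rather than buffer the full result
  jdbc_fetch_size => 10000
  statement => "SELECT * FROM table"
  tags => "test_index"
}
```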
OUTPUT:
output {
  if "test_index" in [tags] {
    elasticsearch {
      hosts => ["ES_SERVER:9200"]
      index => "test_index"
      document_type => "table"
      document_id => "%{table_id}"
      sniffing => false
      timeout => "480"
    }
  }
}