Logstash 5.5.0 out of memory issue

Hi,
ELK is a great tool for advanced analytics on large volumes of data.

We are using ELK version 5.5.0.

When we run Logstash, it keeps crashing every minute while importing about 1 GB of data.

Sample logstash.conf file:

input {
  jdbc {
    type => "type1"
    jdbc_driver_library => "mysql-connector-java-5.1.35"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://ip:port/schema"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_paging_enabled => "true"
    lowercase_column_names => "false"
    jdbc_fetch_size => 1000
    # six-field cron (seconds first): fires at second 1 of every minute
    schedule => "1 * * * * *"
    last_run_metadata_path => "lastrunfile"
    statement_filepath => "sql1"
  }
  jdbc {
    type => "type2"
    jdbc_driver_library => "mysql-connector-java-5.1.35"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://ip:port/schema"
    jdbc_user => "user"
    jdbc_password => "pwd"
    jdbc_paging_enabled => "true"
    lowercase_column_names => "false"
    jdbc_fetch_size => 1000
    schedule => "1 * * * * *"
    last_run_metadata_path => "lastrunpath"
    statement_filepath => "sql2"
  }
}
output {
  if [type] == "type1" {
    elasticsearch {
      hosts => ["ip:port"]
      manage_template => false
      index => "index1"
      document_id => "%{doc_id}"
    }
  }
  if [type] == "type2" {
    elasticsearch {
      hosts => ["ip:port"]
      manage_template => false
      index => "index2"
      document_id => "%{doc_id}"
    }
  }
}

Is there any problem in the config? We also set the -Xmx option to 2 GB, but Logstash still crashes and is not even able to create the indexes.
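For reference, in Logstash 5.x the heap is configured in config/jvm.options rather than via a command-line flag. This is roughly what our change looks like (a sketch; the 2 GB figures are what we set and may need to be raised for this workload):

# config/jvm.options (Logstash 5.x)
# Initial and maximum JVM heap size, both raised to 2 GB
-Xms2g
-Xmx2g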

FYI we’ve renamed ELK to the Elastic Stack, otherwise Beats feels left out :wink:

Try reducing jdbc_fetch_size?
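For example, something like this in each jdbc input (a sketch based on your config; 100 is just an illustrative value, and jdbc_page_size is an optional extra knob if you also want smaller pages with jdbc_paging_enabled, its plugin default is 100000):

jdbc {
  jdbc_driver_library => "mysql-connector-java-5.1.35"
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  jdbc_connection_string => "jdbc:mysql://ip:port/schema"
  jdbc_user => "user"
  jdbc_password => "password"
  jdbc_paging_enabled => "true"
  jdbc_page_size => 10000    # optional: smaller pages (plugin default is 100000)
  jdbc_fetch_size => 100     # reduced from 1000; illustrative value
  schedule => "1 * * * * *"
  last_run_metadata_path => "lastrunfile"
  statement_filepath => "sql1"
}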

Let me try with a reduced fetch size.
Thank you :grinning:
