Large data import from MySQL to Elasticsearch fails partway through

Hello Everybody,

I'm new to Elasticsearch and I'm trying to import 32 million rows from MySQL into Elasticsearch using Logstash. It works fine at first, but once the index reaches about 3.5 million documents it breaks and won't accept any more data. I've increased the heap size to 2 GB and disabled refresh_interval, but it still fails. The host has 4 Xeon CPUs, 12 GB RAM, and about 200 GB free on an SSD.
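In case it helps, this is roughly how I disabled refreshes before starting the import. It's only a sketch using the official elasticsearch Python client; "mysql_import" and the localhost URL are placeholders for my real index name and host.

# Sketch: turn off periodic refreshes during the bulk load.
# "mysql_import" and the URL are placeholders, not my real values.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# refresh_interval = -1 disables automatic refreshes while Logstash indexes.
es.indices.put_settings(
    index="mysql_import",
    body={"index": {"refresh_interval": "-1"}},
)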

The log shows that the problem is an allocation failure:
2018-08-15T05:14:42.867-0700: 4980.112: [GC (Allocation Failure) 2018-08-15T05:14:42.867-0700: 4980.112: [ParNew
Desired survivor size 17432576 bytes, new threshold 1 (max 6)
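Because of those allocation failure lines I've been watching the heap while the import runs. This is just a quick check with the same Python client (a sketch, same placeholder host as above):

# Sketch: print JVM heap usage per node to see how close I am to the 2 GB limit.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

stats = es.nodes.stats(metric="jvm")
for node_id, node in stats["nodes"].items():
    heap = node["jvm"]["mem"]
    print(node["name"], heap["heap_used_percent"], "% of",
          heap["heap_max_in_bytes"] // (1024 * 1024), "MB heap used")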

The cluster status is also yellow. Could that be because I'm only running a single node? The health output is below, followed by a sketch of what I'm thinking of changing:
{
  "cluster_name": "elasticsearch",
  "status": "yellow",
  "timed_out": false,
  "number_of_nodes": 1,
  "number_of_data_nodes": 1,
  "active_primary_shards": 5,
  "active_shards": 5,
  "relocating_shards": 0,
  "initializing_shards": 0,
  "unassigned_shards": 5,
  "delayed_unassigned_shards": 0,
  "number_of_pending_tasks": 0,
  "number_of_in_flight_fetch": 0,
  "task_max_waiting_in_queue_millis": 0,
  "active_shards_percent_as_number": 50.0
}
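
If the yellow status is just the replica shards that can't be assigned anywhere (there is no second node to hold them), I assume I could drop replicas to 0 during the import, something like this (same placeholder index name and host as above):

# Sketch: set replicas to 0 so a single-node cluster has no unassigned shards.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

es.indices.put_settings(
    index="mysql_import",
    body={"index": {"number_of_replicas": 0}},
)

# Once nothing is unassigned, the status should report "green".
print(es.cluster.health()["status"])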

Thank you for any advice!
