java.lang.OutOfMemoryError


(Groot) #1

Hi,

I have to migrate MySQL data to Elasticsearch. My Logstash configuration file is:

I am getting an error when starting Logstash.

In the Logstash JVM configuration file I changed the heap size, trying both 512m and 8g, but after restarting, Logstash gives the same error.
Please help me to resolve the issue.


(Javier) #2

Hi there:

Just to confirm: when you changed the Logstash JVM heap size, did you change both the minimum and maximum values?
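For reference, in a typical Logstash install the heap is set in `config/jvm.options`, and the usual recommendation is to set both values to the same size so the heap doesn't resize at runtime. A sketch with an assumed 8 GB heap:

```
# config/jvm.options (example values)
-Xms8g
-Xmx8g
```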

How big is the database you're trying to migrate?


(Groot) #3

Thanks for the reply, Xavy...

Yes, I changed both the minimum and maximum values. My database is 12.5 GB.


(Javier) #4

Hi again :

Have you tried temporarily increasing the heap to at least the size of the database? Alternatively, you could add a WHERE clause to your queries to reduce the output size so it fits into the heap.
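A sketch of what that could look like in the jdbc input (the database, table, and column names here are made up for illustration, since the original config wasn't shown):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # WHERE clause limits the result set so it fits in the heap
    statement => "SELECT * FROM orders WHERE id <= 100000"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders"
  }
}
```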


(Groot) #5

Thanks Xavy..

That means I have to assign a 13 GB heap to the JVM. But I have 16 GB of RAM, so is it feasible to assign 13 GB to the JVM?

Let me try using a WHERE clause.


(Javier) #6

You should be able to assign that size... as long as you have enough free memory. However, you would probably be too close to the limit at which the system begins to swap, which is obviously not a good thing at all.

So we'll wait for your test using WHERE clauses, to see whether it helps.


(Groot) #7

Hi Xavy

After limiting the SELECT statement, it now works. :slight_smile:


(Groot) #8

Hi Xavy,

Thanks a ton...

Can you please suggest a method so that I can migrate the whole dataset?


(Javier) #9

If the machine is a VM, I would definitely go for increasing the RAM and raising the max heap size to the whole size of the database.

If the machine is physical, you could stop any services that aren't needed and check how much RAM you have available (with Logstash stopped). If you then have more than those 13 GB free, I would go for temporarily increasing the heap size.

If RAM is not enough, I would try doing the migration in parts, that is, run first with one particular SELECT ... WHERE, then with another, and so on.
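For example, each run could cover a different slice of the table (the `id` ranges below are hypothetical, assuming a numeric primary key):

```
# Run 1:
statement => "SELECT * FROM orders WHERE id > 0 AND id <= 100000"
# Run 2:
statement => "SELECT * FROM orders WHERE id > 100000 AND id <= 200000"
# ...and so on, until the whole table is covered
```

If your Logstash version supports it, the jdbc input's `jdbc_paging_enabled` and `jdbc_page_size` options are also worth a look, since they fetch results in pages instead of loading the whole result set at once.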

Another possibility, so this doesn't take as much time, would be to define several jdbc input entries, but I'm not sure whether this would work (it depends on whether Logstash processes those inputs serially - it should work - or in parallel - then Java would run out of memory again). That said, the most probable thing is that inputs are handled in parallel, so this might not be the solution (the individual WHERE'd statements should be).
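A sketch of the multiple-inputs idea, if you want to try it despite the caveat above (again with made-up table names; each input would carry the same connection settings as a single-input config):

```
input {
  jdbc {
    # ...connection settings as in the single-input config...
    statement => "SELECT * FROM orders WHERE id <= 100000"
  }
  jdbc {
    # ...connection settings as in the single-input config...
    statement => "SELECT * FROM orders WHERE id > 100000"
  }
}
```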


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.