I'm writing a parser that loads rather large files (400+ MB) and parses them in roughly 32 MB chunks, then saves the parsed data to disk. I save the data by running a thread that owns a synchronized list: the thread checks the list periodically, saves whatever it finds there, and then removes that element from the list. However, the VM's memory use keeps growing.
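The setup described above (a parser thread handing parsed chunks to a saver thread through a shared collection) can be sketched with a bounded `BlockingQueue` instead of a manually synchronized list. This is a minimal, hypothetical sketch, not the poster's actual code: `ChunkWriter`, `process`, the queue capacity of 4, and the 32 KB stand-in chunks are all assumptions made for illustration. The point it demonstrates is that a *bounded* queue caps how many chunks can be alive at once, because `put` blocks when the writer falls behind.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class ChunkWriter {
    private static final byte[] POISON = new byte[0]; // end-of-stream marker

    // Parses n chunks and hands each one to a writer thread through a
    // bounded queue; returns the number of chunks the writer "saved".
    static int process(int n) throws InterruptedException {
        // Capacity 4 is an arbitrary choice: at most 4 chunks buffered.
        BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>(4);
        AtomicInteger saved = new AtomicInteger();

        Thread writer = new Thread(() -> {
            try {
                byte[] chunk;
                while ((chunk = queue.take()) != POISON) {
                    // Save the chunk to disk here (omitted). Once this
                    // iteration ends, the chunk is unreachable and is
                    // eligible for garbage collection immediately.
                    saved.incrementAndGet();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        writer.start();

        for (int i = 0; i < n; i++) {
            byte[] chunk = new byte[32 * 1024]; // stand-in for a parsed 32 MB chunk
            queue.put(chunk); // blocks when the writer falls behind, capping memory
        }
        queue.put(POISON); // tell the writer there is nothing more to save
        writer.join();
        return saved.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("saved " + process(10) + " chunks");
    }
}
```

With a plain unbounded list, the parser can outrun the saver and the list silently accumulates chunks; the bounded queue makes that backlog impossible by design.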
Parsing is very fast when the Java Virtual Machine's maximum heap size is large, but it obviously slows down once memory use reaches that cap. I can load a 400 MB file with 300 MB of memory, but it is really slow.
Why do objects that I no longer use persist in memory, even though they are perfectly eligible for deallocation by the garbage collector (which is really slow)?
How do I prevent the heap from becoming huge?