[Orca-users] Memory Errors
bias16
list2002 at bias.org
Sat Mar 9 18:41:04 PST 2002
Has anyone figured out how to restrict the amount of memory that perl
is using?
I got the following error message before a couple of my orca processes
died about 4 days into running:
Out of memory during "large" request for 135168 bytes, total sbrk() is
791121200 bytes at /usr/local/orca/lib/Orca/ImageFile.pm line 318.
As indicated in an earlier post, I am running everything in /tmp of a
T1400 with 4Gig memory, 2Gig swap, and 4 CPUs. The Orca files (rrd,
html, orcallator) take up about 2Gig of space.
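The closest I have come to actually restricting perl is putting a
ulimit on the data segment in a small wrapper around the start
command, so the process fails fast instead of growing to the size
above. A rough sketch (the orca path and the kbyte limit are just
assumptions to adjust for your install):

  #!/bin/ksh
  # hypothetical wrapper -- adjust the orca path for your install
  ORCA=/usr/local/orca/bin/orca
  # cap the data segment at roughly 512 MB; ksh's ulimit -d takes
  # kbytes, so perl's malloc fails early instead of passing 750 MB
  ulimit -d 524288
  # start orca the way you normally do, passing any arguments through
  exec $ORCA "$@"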
Has anyone had any luck solving this problem, other than just killing
and restarting the orca daemon script once a day?
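If it comes to that, the daily restart could at least be driven from
cron. Something like the entries below, where both commands are only
placeholders for whatever actually stops and starts orca on the box:

  # hypothetical root crontab entries
  0 3 * * *  /usr/local/orca/bin/stop_orca    # placeholder stop command
  5 3 * * *  /usr/local/orca/bin/start_orca   # placeholder start command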
Would adding swap help? It seems like it would just delay the problem
and slow things down, but I may try it.
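If I do try it, adding swap on Solaris should just be something like
the lines below (the path is only an example, and since /tmp is
tmpfs, the 2Gig of orca files sitting there already come out of the
same virtual memory pool):

  # run as root; /export/swapfile is an example path with free space
  mkfile 2048m /export/swapfile
  swap -a /export/swapfile
  swap -l          # confirm the new swap area is listed
  # add an entry to /etc/vfstab if it should survive a reboot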
I've thought about purchasing a 280R into which I could add 16Gig of
Dataram memory. Unfortunately, it only has two processors
(albeit 900MHz UltraSPARC IIIs), so I'm not sure if that would be
better. I would need to give up the T1400 to purchase the 280R.
Thanks,
Liston