[Orca-users] Re: Memory Errors
bias16
list2002 at bias.org
Tue Mar 12 21:08:50 PST 2002
I'm using Perl 5.6.1.
Restarting the orca scripts once a day seems to be doing the trick.
I just pkill -9 orca and then restart them all.
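The kill-and-restart cycle can be automated with cron so it runs unattended; a minimal sketch of a crontab entry, where the install path and config file name are assumptions for illustration:

```
# Daily at 04:00: forcibly kill all orca processes, wait for them to
# exit and release memory, then restart the grapher.
# Paths and the config file name are illustrative; adjust as needed.
0 4 * * * pkill -9 orca; sleep 5; cd /usr/local/orca && bin/orca orcallator.cfg
```

SIGKILL mirrors the `pkill -9` above; a plain `pkill orca` (SIGTERM) would be gentler if orca shuts down cleanly on your system.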
I'm up to about 260 clients and it is working very well, with a 5-10
minute delay on graphs.
- Liston
--- In orca-users at y..., Blair Zajac <blair at o...> wrote:
> bias16 wrote:
> >
> > Has anyone figured out how to restrict memory usage that perl is
> > using?
>
> No, not yet. Which version of Perl are you using?
>
> There may be some ways to modify Orca to release more memory when it
> has finished using it. I've tried to do a good job of deleting hashes
> and arrays when they are no longer needed, but I may have missed some.
>
> >
> > I got the following error message before a couple of my orca processes
> > died about 4 days into running:
> >
> > Out of memory during "large" request for 135168 bytes, total sbrk() is
> > 791121200 bytes at /usr/local/orca/lib/Orca/ImageFile.pm line 318.
> >
> > As indicated in an earlier post, I am running everything in /tmp on a
> > T1400 with 4 GB of memory, 2 GB of swap, and 4 CPUs. The Orca files (rrd,
> > html, orcallator) take up about 2 GB of space.
> >
> > Has anyone had luck solving this problem other than just restarting the
> > orca daemon script once a day?
>
> That's the only method.
>
> Best,
> Blair
> --
> Blair Zajac <blair at o...>
> Web and OS performance plots - http://www.orcaware.com/orca/