[Orca-users] Out of memory during request for 1016 bytes error
Steve Waltner
swaltner at lsil.com
Thu Dec 19 07:25:01 PST 2002
On Wednesday, December 18, 2002, at 05:24 PM, Michael O'Dea wrote:
> Hello all
>
> While orca is firing up and starting to read files I get this error:
>
> Out of memory during request for 1016 bytes, total sbrk() is 27999536
> bytes!
>
> after about 150 lines of
>
> /opt/orca-0.27b2/bin/orca: warning: cannot open
> `/opt/orca-0.27b2/orcallator/st-ivr/percol-2002-12-15' for reading:
> Too many open files
> /opt/orca-0.27b2/bin/orca: warning: cannot open
> `/opt/orca-0.27b2/orcallator/st-ivr/percol-2002-12-16' for reading:
> Too many open files
> /opt/orca-0.27b2/bin/orca: warning: cannot open
> `/opt/orca-0.27b2/orcallator/st-ivr/percol-2002-12-16' for reading:
> Too many open files
> /opt/orca-0.27b2/bin/orca: warning: cannot open
> `/opt/orca-0.27b2/orcallator/st-ivr/percol-2002-12-17' for reading:
> Too many open files
>
> Now the "too many open files" error I have always gotten - but this
> "Out of memory" error is new.
>
> Did I maybe reach some limit of rrd files that Orca can process?
>
> -m
Have you looked at the output of limit/ulimit? In addition to the file
descriptor limit mentioned in item 2.3 of the FAQ
(http://svn.orcaware.com:8000/repos/trunk/orca/FAQ), you can also raise
the datasize limit, i.e. the maximum amount of RAM the process can
malloc(). Use "ulimit -d" in sh-style shells and "limit datasize" in
csh-style shells. On my system, limit reports the following, which
allows a process to allocate memory until swap space is exhausted:
ra:~> limit
cputime      unlimited
filesize     unlimited
datasize     unlimited
stacksize    8192 kbytes
coredumpsize 4096 kbytes
vmemoryuse   unlimited
descriptors  256
ra:~> limit -h
cputime      unlimited
filesize     unlimited
datasize     unlimited
stacksize    unlimited
coredumpsize unlimited
vmemoryuse   unlimited
descriptors  1024
ra:~>
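For sh-style shells, the equivalent checks and adjustments would look
something like the sketch below. This is only illustrative: the exact
flags ("-n" for descriptors, "-d" for datasize, "-H" to query hard
limits) are bash/ksh conventions, and the specific values 1024 and
unlimited are examples, not recommendations from this thread. Raising a
soft limit above the hard limit will fail unless you are root.

```shell
#!/bin/sh
# Inspect current soft limits (what the shell and its children get).
ulimit -n    # open file descriptors -- relevant to "Too many open files"
ulimit -d    # data segment size in kbytes -- relevant to "Out of memory"

# Inspect the hard limits (the ceiling a non-root user can raise to).
ulimit -H -n
ulimit -H -d

# Raise the soft limits for this shell and its children, e.g. in the
# script that starts orca. Values here are hypothetical examples.
ulimit -n 1024
ulimit -d unlimited 2>/dev/null || echo "datasize hard limit is lower"
```

csh/tcsh users would use "limit descriptors 1024" and "limit datasize
unlimited" instead, matching the "limit" transcript above.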
Steve