[Orca-users] Problems running Orca 0.26 with a large number of servers to be monitored
Boeykens, Pieter [NCSBE]
pboeyke1 at ncsbe.jnj.com
Thu Feb 22 23:49:31 PST 2001
All,
I am using Orca version 0.26beta1 (with RRDs version 1.000131) to monitor
a large number of Sun servers (approx. 50). I run the Orca data collector on
every box and send the data over to a central server using rsync; there I
generate the graphs and run a web server. I now have several months' worth
of raw data, and all the raw data files are compressed with gzip.
I thus end up with a data directory on the central server, consisting of a
subdirectory for every Sun server and containing more than 3000 files in total.
On average, every server subdirectory contains approx. 100 compressed files.
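(For the curious, the counts above come from a quick perl walk over the data
directory, roughly like the sketch below; the root path and the percol- file
naming match the error output further down.)

    #!/usr/bin/perl -w
    # Count the percol data files under the Orca data directory,
    # one subdirectory per monitored server.
    use strict;
    my $root = '/opt/orca/var/orca/orcallator';
    my $total = 0;
    opendir(ROOT, $root) or die "cannot open $root: $!\n";
    foreach my $host (sort grep { !/^\.\.?$/ && -d "$root/$_" } readdir(ROOT)) {
        opendir(DIR, "$root/$host") or next;
        my $n = grep { /^percol-/ } readdir(DIR);
        closedir(DIR);
        print "$host: $n files\n";
        $total += $n;
    }
    closedir(ROOT);
    print "total: $total files\n";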
When I start up the Orca tool, I get the following output:
Finding files and setting up data structures at Fri Feb 23 00:16:50 2001.
./bin/orca: warning: cannot open
`/opt/orca/var/orca/orcallator/ncsbex58/percol-2001-02-14' for reading: Too many open files
./bin/orca: warning: shrinking maximum number open files to 58.
libthread panic: _sys_thread_create():alloc_thread returns 0 (no mem) (PID: 9263 LWP 1)
stacktrace:
ef523200
56b58
99620
8d2b0
a0efc
25edc
23490
23308
0
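To see how many files a single perl process can actually open here before
hitting this, I used a throwaway sketch like the following (nothing
Orca-specific; /dev/null is just a convenient file to open repeatedly):

    #!/usr/bin/perl -w
    # Open /dev/null until open() fails, to find the effective
    # per-process open-file limit behind "Too many open files".
    # (stdin/stdout/stderr already use three descriptors.)
    use strict;
    use IO::File;
    my @handles;
    while (my $fh = IO::File->new('/dev/null', 'r')) {
        push @handles, $fh;
    }
    print "opened ", scalar(@handles), " handles before failing: $!\n";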
As you can see, Orca reports a memory shortage and stops functioning;
however, I still have 0.4 GB of free memory at that moment.
I am using perl, version 5.005_03.
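In case it is a per-process limit, rather than free system memory, that Orca
is running into, the process's own resource limits can be read from perl like
this (a sketch assuming the BSD::Resource module from CPAN is installed):

    #!/usr/bin/perl -w
    # Print this process's soft/hard limits on open files and on
    # the data segment size (requires BSD::Resource from CPAN).
    use strict;
    use BSD::Resource;
    my ($fd_soft, $fd_hard)     = getrlimit(RLIMIT_NOFILE);
    my ($data_soft, $data_hard) = getrlimit(RLIMIT_DATA);
    print "open files: soft=$fd_soft hard=$fd_hard\n";
    print "data seg:   soft=$data_soft hard=$data_hard\n";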
Any ideas what might cause this problem?
Regards,
Pieter Boeykens
Johnson & Johnson
e-Business Systems Engineering
Networking & Computing Services EMEA
Turnhoutseweg 30
2340 Beerse
Belgium
Tel : +32-14-607157
Fax : +32-14-602640