[Orca-checkins] rev 98 - in trunk/orca: . lib src
blair at orcaware.com
Sat Jul 13 18:37:01 PDT 2002
Author: blair
Date: Fri, 28 Jun 2002 21:56:14 -0700
New Revision: 98
Added:
trunk/orca/ARCHITECTURE
trunk/orca/REQUIREMENTS
trunk/orca/lib/time_gets.cfg
Modified:
trunk/orca/CHANGES
trunk/orca/INSTALL
trunk/orca/Makefile.in
trunk/orca/TODO
trunk/orca/lib/percollator.cfg.in
trunk/orca/src/orca.pl
trunk/orca/src/percol_column.pl
trunk/orca/src/percol_running.pl.in
Log:
Load orca-0.17 into trunk/orca.
Added: trunk/orca/ARCHITECTURE
==============================================================================
--- trunk/orca/ARCHITECTURE (original)
+++ trunk/orca/ARCHITECTURE Sat Jul 13 18:36:52 2002
@@ -0,0 +1,122 @@
+This file describes Orca's internal design.
+
+Orca makes use of several internal classes (objects). They are:
+
+ Orca::OpenFileHash
+ Orca::HTMLFile
+ Orca::DataFile
+ Orca::SourceDataFile is a subclass of Orca::DataFile
+ Orca::RRDFile is a subclass of Orca::DataFile
+ Orca::GIFFile
+
+Orca::OpenFileHash
+
+  This class provides a cache of open file descriptors to the
+  user of the class.  Upon creation, Orca::OpenFileHash is
+  told how many open file descriptors to cache.  When a file
+  descriptor is needed for a file, the filename is passed to
+  a method; if the descriptor is already open, then it is
+  handed back to the caller, otherwise the file is opened
+  and the newly opened descriptor is passed back.  If the
+  maximum number of file descriptors are already open, then
+  the class closes the least recently used one.
+
+  This class is used by Orca::SourceDataFile to keep files
+  open while Orca waits for file updates.  A short usage
+  sketch follows the method list below.
+
+ constructor new
+ method open
+ method add
+ method close
+ method change_weight
+ method list
+ method select
+ method get_fd
+ method sysread_readline
+ method is_open
+ hidden method _close_extra
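+
+  For example (the cache size and the open() arguments match how
+  orca.pl uses the class; the remaining calls and variable names
+  are illustrative only):
+
+      my $cache = Orca::OpenFileHash->new(150);
+      my $fd    = $cache->open($filename, $weight);
+      # read from $fd; once the cache holds the maximum number of
+      # descriptors, the least recently used one is closed
+      $cache->close($filename) if $cache->is_open($filename);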
+
+Orca::HTMLFile
+
+  This class is basically an object file descriptor that can be
+  printed to.  This class does not provide the same level of
+  abstraction that IO::File does.  It only supports the print
+  method.
+
+  Upon creation of an HTML file, this class prints a standard
+  amount of HTML to the beginning of the file.  This includes
+  the standard <html> and other tags.  Then, when the object
+  is destroyed, the DESTROY method writes some trailing HTML
+  to the file.  See the example after the method list.
+
+ constructor new
+ method print
+ hidden method DESTROY
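+
+  For example, following how orca.pl creates these objects (the
+  variable names are illustrative):
+
+      my $html = Orca::HTMLFile->new($filename, $title,
+                                     $page_header, $page_footer);
+      if ($html) {
+          $html->print("<h2>Available Targets</h2>\n");
+          # the trailing HTML is written when $html is destroyed
+      }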
+
+Orca::DataFile
+
+  Orca::DataFile is a class meant to be subclassed by other
+  classes.  What it does is cache file information, such as the
+  inode number, the device number of the mount point, and the
+  file's modification time (mtime).  The class can be instructed
+  to update the cache upon demand and to return the time when
+  the file was last stat()ed.
+
+  The file information is cached to avoid file access and
+  system call overhead that can be prevented.  A sketch of
+  typical use appears after the method list.
+
+ constructor new
+ method filename
+ method file_dev
+ method file_ino
+ method file_mtime
+ method last_stat_time
+ method update_stat
+ method status
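+
+  For example, in a subclass (the status() test mirrors orca.pl;
+  the rest is illustrative):
+
+      my $self = $class->SUPER::new($filename);
+      if ($self->status >= 0) {
+          my $mtime = $self->file_mtime;   # served from the cache
+          $self->update_stat;              # refresh the cached stat
+      }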
+
+Orca::SourceDataFile
+
+  Orca::SourceDataFile is a subclass of Orca::DataFile.  This
+  class's primary mission is to load and parse data from source
+  text files and hand the data off.
+
+ constructor new
+ method is_current
+ method next_load_time
+ method get_column_names
+ method get_date_column
+ method add_plots
+ method load_new_data
+ method rrds
+
+
+Orca::RRDFile
+
+ constructor new
+ method name
+ method rrd_update_time
+ method add_gif
+ method created_gifs
+ method queue_data
+ method flush_data
+
+Orca::GIFFile
+
+ constructor new
+ method add_rrds
+ method rrds
+ method plot_ref
+ method group
+ method files_key
+ method name
+ method no_group_name
+ method plot_end_time
+ method plot
+ hidden method _plot
+ hidden sub _expire_string
+
Modified: trunk/orca/Makefile.in
==============================================================================
--- trunk/orca/Makefile.in (original)
+++ trunk/orca/Makefile.in Sat Jul 13 18:36:53 2002
@@ -11,7 +11,7 @@
install:
./config/mkinstalldirs $(PERCOLLATOR_DIR)
- ./config/mkinstalldirs $(RRD_DIR)/percollator
+ ./config/mkinstalldirs $(RRD_DIR)
@for dir in $(SUBDIRS); do \
echo "cd $$dir; $(MAKE) install"; \
(cd $$dir; $(MAKE) install); \
Added: trunk/orca/REQUIREMENTS
==============================================================================
--- trunk/orca/REQUIREMENTS (original)
+++ trunk/orca/REQUIREMENTS Sat Jul 13 18:36:53 2002
@@ -0,0 +1,30 @@
+Orca::SourceDataFile
+
+ Requirements
+ Manages one source data file
+ Knows when the data file will be updated
+ Only class to know the internal format of the file
+ Can have brand new data fields at any point in time
+
+ Internal Storage
+
+ Methods
+
+Orca::RRDFile
+
+ Requirements
+    Manages one RRD file
+ RRD file can contain data resulting from arbitrary math
+ expressions.
+
+ Internal Storage
+
+ Methods
+
+Orca::GIFFile
+
+ Requirements
+
+ Internal Storage
+
+ Methods
Modified: trunk/orca/TODO
==============================================================================
--- trunk/orca/TODO (original)
+++ trunk/orca/TODO Sat Jul 13 18:36:53 2002
@@ -1,12 +1,19 @@
Orca:
- Lock file
- Arbitrary date reading
+  Come up with a better error scheme than using warn() for some
+  errors and the email warning for others.
+  Do something better if the number of columns changes in a single
+  percollator file.
+  Have a scheme where at any point in time a data file may add or
+  change the number of columns of data and column names.
+ Lock file.
+ Arbitrary date reading.
Use Cricket's configuration ConfigTree?????
- More configuration file defaults
- Better date loading support
+ More configuration file defaults.
+ Better date loading support.
Make plots from multiple files sets: delete source files_key and put
- it into data
- Update HTML files if a new file is found with a new group
+ it into data.
+ Update HTML files if a new file is found with a new group.
percollator.se:
- Better documentation
+ Better documentation.
+ Update using 3.1 diffs from 3.0.
Modified: trunk/orca/INSTALL
==============================================================================
--- trunk/orca/INSTALL (original)
+++ trunk/orca/INSTALL Sat Jul 13 18:36:53 2002
@@ -3,14 +3,14 @@
2) Install necessary Perl modules.
a) Install Math::IntervalSearch version 1.00 or greater.
b) Install Digest::MD5 version 2.00 or greater.
- c) Install RRDs version 0.99.1 or greater.
+ c) Install RRDs version 0.99.6 or greater.
3) Decide where Orca's binaries, RRD, HTML, and percollator directories
will reside. Make sure performance concerns are handled.
4) Configure Orca.
5) Install Orca.
6) [Optional] Install percollator.
a) Install the SE toolkit.
- b) Apply a patch to the SE toolkit.
+ b) Apply a patch to the SE 3.0 toolkit.
c) Examine Orca/percollator programs.
d) Run start_percol on all systems.
e) Edit percollator.cfg.
@@ -23,12 +23,10 @@
work with older versions of Perl. I welcome feedback if Orca works
with older Perls.
- This step is too large to go into here. The bottom line is to get
- the latest Perl from
+ This step is too large to go into here. The bottom line is to
+ follow the instructions at
- ftp://ftp.funet.fi/pub/languages/perl/CPAN/src/stable.tar.gz
-
- and compile and install it.
+ http://language.perl.com/info/software.html
2) Install necessary Perl modules.
a) Install Math::IntervalSearch version 1.00 or greater.
@@ -49,16 +47,16 @@
Download Digest::MD5 from:
- http://www.perl.com/CPAN/authors/id/GAAS/Digest-MD5-2.01.tar.gz
+ http://www.perl.com/CPAN/authors/id/GAAS/Digest-MD5-2.02.tar.gz
- % gunzip -c Digest-MD5-2.01.tar.gz | tar xvf -
- % cd Digest-MD5-2.01
+ % gunzip -c Digest-MD5-2.02.tar.gz | tar xvf -
+ % cd Digest-MD5-2.02
% perl Makefile.PL
% make
% make test
% make install
- c) Install RRDs version 0.99.1 or greater.
+ c) Install RRDs version 0.99.6 or greater.
Download RRDs from:
@@ -151,7 +149,11 @@
http://www.sun.com/sun-on-net/performance/se3/
- b) Apply a patch to the SE toolkit.
+    If you are running Solaris 2.6 or greater, then download SE 3.1
+    or greater.  Otherwise you will need SE 3.0.
+
+ b) Apply a patch to the SE 3.0 toolkit. If you are running any other
+ release of SE, then do not install the patch.
By default the SE toolkit will install into /opt/RICHPse.
Run this command:
Modified: trunk/orca/lib/percollator.cfg.in
==============================================================================
--- trunk/orca/lib/percollator.cfg.in (original)
+++ trunk/orca/lib/percollator.cfg.in Sat Jul 13 18:36:53 2002
@@ -1,7 +1,7 @@
# Orca configuration file for Percollator files.
-base_dir @RRD_DIR@
-state_file percollator.state
+base_dir @RRD_DIR@/percollator
+state_file orca.state
html_dir @HTML_DIR@
expire_gifs 1
@@ -15,7 +15,7 @@
# This defines the email address of people to warn when a file that is
# being updated constantly stops being updated. For mathematical
-# expressions use the word interval to get the interval number for
+# expressions use the word `interval' to get the interval number for
# the data source.
warn_email root at localhost
late_interval interval + 30
@@ -125,8 +125,8 @@
source percol
data httpop/s
data http/p5s
-legend 5 minute average hits/s
-legend Peak 5 second interval hits/s
+legend 5 min average hits/s
+legend Peak 5 second hits/s
y_legend Hits/second
data_min 0
optional
@@ -372,8 +372,8 @@
title %g TCP Number Open Connections
source percol
data tcp_estb
-legend Number Open TCP Connections
-y_legend Number Open Connections
+legend # Open Connections
+y_legend Number Open TCP Connections
data_min 0
data_max 50000
}
@@ -481,7 +481,7 @@
title %g Inode Steal Rate
source percol
data inod_stl/s
-legend Inode w/page steal rate
+legend Inode w/page steals/s
y_legend Steals/s
data_min 0
}
Added: trunk/orca/lib/time_gets.cfg
==============================================================================
--- trunk/orca/lib/time_gets.cfg (original)
+++ trunk/orca/lib/time_gets.cfg Sat Jul 13 18:36:53 2002
@@ -0,0 +1,540 @@
+# Orca configuration file for timing HTTP GETs.
+
+base_dir @RRD_DIR@/time_gets
+state_file orca.state
+html_dir @HTML_DIR@/time_gets
+expire_gifs 1
+
+# Find files at the following times:
+#  0:10 to pick up new data files for the new day.
+#  1:00 to pick up latecomer data files for the new day.
+# 6:00 to pick up new files before the working day.
+# 12:00 to pick up new files during the working day.
+# 19:00 to pick up new files after the working day.
+find_times 0:10 1:00 6:00 12:00 19:00
+
+# This defines the email address of people to warn when a file that is
+# being updated constantly stops being updated. For mathematical
+# expressions use the word interval to get the interval number for
+# the data source.
+warn_email root at localhost
+late_interval interval + 30
+
+# This defines where to find the source data files and the format of
+# those files.
+files time_gets {
+find_files /home/bzajac/time_gets/(.*)/data.\d{8}
+column_description first_line
+date_source column_name timestamp
+date_format %s
+interval 300
+reopen 1
+}
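+
+# For illustration only (this particular path is hypothetical): a file
+# named /home/bzajac/time_gets/www1/data.19990216 matches the
+# find_files pattern above, and the text captured by (.*), here "www1",
+# is used as the group name that %g expands to in the plot titles
+# below.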
+
+html_top_title GeoCities Host Status
+
+html_page_header
+ <table border=0 cellspacing=0 cellpadding=0 width="100%">
+ <tr>
+ <td><a href="http://www.geocities.com">
+ <img border=0 alt="GeoCities"
+ src="http://pic.geocities.com/images/main/hp/logo_top.gif"
+ width=126 height=58></a>
+ </td>
+ </tr>
+ <tr>
+ <td><a href="http://www.geocities.com">
+ <img border=0 alt="GeoCities"
+ src="http://pic.geocities.com/images/main/hp/tagline.gif"
+ width=124 height=36></a>
+ </td>
+ </tr>
+ </table>
+ <spacer type=vertical size=4>
+
+html_page_footer
+ <spacer type=vertical size=20>
+ <font face="Arial,Helvetica">
+ These plots brought to you by your local system administrator.
+ </font>
+
+plot {
+title %g Average # Processes in Run Queue
+source percol
+data 1runq
+data 5runq
+data 15runq
+legend 1 Minute Average
+legend 5 Minute Average
+legend 15 Minute Average
+y_legend # Processes
+data_min 0
+data_max 100
+}
+
+plot {
+title %g System Load
+source percol
+data 1load
+data 5load
+data 15load
+legend 1 Minute Average
+legend 5 Minute Average
+legend 15 Minute Average
+y_legend Load
+data_min 0
+data_max 200
+}
+
+plot {
+title %g Number of System & Httpd Processes
+source percol
+data #proc
+data #httpds
+line_type line1
+line_type area
+legend System total
+legend Number httpds
+y_legend # Processes
+data_min 0
+data_max 10000
+}
+
+plot {
+title %g CPU Usage
+source percol
+data usr%
+data sys%
+data 100 - usr% - sys%
+legend User
+legend System
+legend Idle
+line_type area
+line_type stack
+line_type stack
+y_legend Percent
+data_min 0
+data_max 100
+plot_min 0
+plot_max 100
+rigid_min_max 1
+}
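+
+# In the plot above, the three line_type entries draw the first data
+# source (User) as a filled area and stack System and Idle on top of
+# it, so the three bands always sum to 100 percent of the CPU.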
+
+plot {
+title %g Web Server Hit Rate
+source percol
+data httpop/s
+data http/p5s
+legend 5 minute average hits/s
+legend Peak 5 second interval hits/s
+y_legend Hits/second
+data_min 0
+optional
+}
+
+plot {
+title %g Web Server File Size
+source percol
+data %to1KB
+data %to10KB
+data %to100KB
+data %to1MB
+data %over1MB
+line_type area
+line_type stack
+line_type stack
+line_type stack
+line_type stack
+legend 0 - 1 KB
+legend 1 - 10 KB
+legend 10 - 100 KB
+legend 100 - 1000 KB
+legend Greater than 1 MB
+y_legend Percent
+data_min 0
+data_max 100
+plot_min 0
+plot_max 100
+rigid_min_max 1
+}
+
+plot {
+title %g Web Server Data Transfer Rate
+source percol
+data httpb/s
+legend Bytes/s
+y_legend Bytes/s
+data_min 0
+}
+
+plot {
+title %g Web Server HTTP Error Rate
+source percol
+data htErr/s
+legend HTTP Errors/s
+y_legend Errors/s
+data_min 0
+}
+
+plot {
+title %g Bits Per Second: be0
+source percol
+data 1024 * 8 * be0InKB/s
+data 1024 * 8 * be0OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 100000000
+optional
+}
+
+plot {
+title %g Bits Per Second: hme0
+source percol
+data 1024 * 8 * hme0InKB/s
+data 1024 * 8 * hme0OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 100000000
+optional
+}
+
+plot {
+title %g Bits Per Second: hme1
+source percol
+data 1024 * 8 * hme1InKB/s
+data 1024 * 8 * hme1OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 100000000
+optional
+}
+
+plot {
+title %g Bits Per Second: hme2
+source percol
+data 1024 * 8 * hme2InKB/s
+data 1024 * 8 * hme2OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 100000000
+optional
+}
+
+plot {
+title %g Bits Per Second: le0
+source percol
+data 1024 * 8 * le0InKB/s
+data 1024 * 8 * le0OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 10000000
+optional
+}
+
+plot {
+title %g Bits Per Second: le1
+source percol
+data 1024 * 8 * le1InKB/s
+data 1024 * 8 * le1OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 10000000
+optional
+}
+
+plot {
+title %g Packets Per Second: $1
+source percol
+data (.*\d)Ipkt/s
+data $1Opkt/s
+line_type area
+legend Input
+legend Output
+y_legend Packets/s
+data_min 0
+data_max 100000
+flush_regexps 1
+}
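+
+# An illustrative expansion of the plot above (the column names here
+# are hypothetical): if a source file carries columns hme0Ipkt/s and
+# hme0Opkt/s, the (.*\d) capture matches "hme0", so $1 stands for that
+# text in the title and in the second data line, and one such plot is
+# produced per matching interface.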
+
+plot {
+title %g Errors Per Second: $1
+source percol
+data (.*\d)IErr/s
+data $1OErr/s
+line_type area
+legend Input
+legend Output
+y_legend Errors/s
+data_min 0
+flush_regexps 1
+}
+
+plot {
+title %g Ethernet Nocanput Rate
+source percol
+data (.*\d)NoCP/s
+legend $1
+y_legend Nocanput/s
+data_min 0
+flush_regexps 1
+}
+
+plot {
+title %g Ethernet Deferred Packet Rate
+source percol
+data (.*\d)Defr/s
+legend $1
+y_legend Defers/s
+data_min 0
+flush_regexps 1
+}
+
+plot {
+title %g Ethernet Collisions
+source percol
+data (.*\d)Coll%
+legend $1
+y_legend Percent
+data_min 0
+data_max 200
+flush_regexps 1
+}
+
+plot {
+title %g TCP Bits Per Second
+source percol
+data 1024 * 8 * tcp_InKB/s
+data 1024 * 8 * tcp_OuKB/s
+line_type area
+legend Input
+legend Output
+y_legend bits/s
+data_min 0
+data_max 1000000000
+}
+
+plot {
+title %g TCP Segments Per Second
+source percol
+data tcp_Iseg/s
+data tcp_Oseg/s
+line_type area
+legend Input
+legend Output
+y_legend Segments/s
+data_min 0
+data_max 20000
+}
+
+plot {
+title %g TCP Retransmission & Duplicate Received Percentage
+source percol
+data tcp_Ret%
+data tcp_Dup%
+legend Retransmission
+legend Duplicate Received
+y_legend Percent
+data_min 0
+data_max 200
+}
+
+plot {
+title %g TCP New Connection Rate
+source percol
+data tcp_Icn/s
+data tcp_Ocn/s
+legend Input - Passive
+legend Output - Active
+y_legend Connections/s
+data_min 0
+data_max 10000
+}
+
+plot {
+title %g TCP Number Open Connections
+source percol
+data tcp_estb
+legend Number Open TCP Connections
+y_legend Number Open Connections
+data_min 0
+data_max 50000
+}
+
+plot {
+title %g TCP Reset Rate
+source percol
+data tcp_Rst/s
+legend Number TCP Resets/s
+y_legend Resets/s
+data_min 0
+}
+
+plot {
+title %g TCP Attempt Fail Rate
+source percol
+data tcp_Atf/s
+legend TCP Attempt Fails/s
+y_legend Atf/s
+data_min 0
+}
+
+plot {
+title %g TCP Listen Drop Rate
+source percol
+data tcp_Ldrp/s
+data tcp_LdQ0/s
+data tcp_HOdp/s
+legend TCP Listen Drops
+legend TCP Listen Drop Q0
+legend TCP Half Open Drops
+data_min 0
+}
+
+plot {
+title %g Sleeps on Mutex Rate
+source percol
+data smtx
+data smtx/cpu
+legend Sleeps on mutex
+legend Sleeps on mutex/cpu
+y_legend Smtx/s
+data_min 0
+}
+
+plot {
+title %g NFS Call Rate
+source percol
+data nfs_call/s
+legend NFS Calls/s
+y_legend Calls/s
+data_min 0
+}
+
+plot {
+title %g NFS Timeouts & Bad Transmits Rate
+source percol
+data nfs_timo/s
+data nfs_badx/s
+legend NFS Timeouts
+legend Bad Transmits
+y_legend Count/s
+data_min 0
+}
+
+plot {
+title %g Peak & Mean Disk Busy
+source percol
+data disk_peak
+data disk_mean
+line_type line1
+line_type area
+legend Peak Disk Busy
+legend Mean Disk Busy
+y_legend Disk Busy Measure
+data_min 0
+}
+
+plot {
+title %g Cache Hit Percentages
+source percol
+data dnlc_hit%
+data inod_hit%
+legend DNLC
+legend Inode Cache
+y_legend Percent
+data_min 0
+data_max 100
+}
+
+plot {
+title %g Cache Reference Rate
+source percol
+data dnlc_ref/s
+data inod_ref/s
+line_type line1
+line_type area
+legend DNLC
+legend Inode Cache
+y_legend References/s
+data_min 0
+}
+
+plot {
+title %g Inode Steal Rate
+source percol
+data inod_stl/s
+legend Inode w/page steal rate
+y_legend Steals/s
+data_min 0
+}
+
+plot {
+title %g Available Swap Space
+source percol
+data 1024 * swap_avail
+legend Available Swap Space
+y_legend Bytes
+data_min 0
+}
+
+plot {
+title %g Page Residence Time
+source percol
+data page_rstim
+legend Page Residence Time
+y_legend Seconds
+data_min 0
+}
+
+plot {
+title %g Page Usage
+source percol
+data pp_kernel
+data free_pages
+data pagestotl - pp_kernel - free_pages
+data pagestotl
+line_type area
+line_type stack
+line_type stack
+line_type line2
+legend Kernel
+legend Free List
+legend Other
+legend System Total
+y_legend Number of Pages
+data_min 0
+plot_min 0
+color 00ff00
+color ff0000
+color 0000ff
+}
+
+plot {
+title %g Pages Locked & IO
+source percol
+data pageslock
+data pagesio
+legend Locked
+legend IO
+y_legend Number of Pages
+data_min 0
+plot_min 0
+}
Modified: trunk/orca/src/percol_running.pl.in
==============================================================================
--- trunk/orca/src/percol_running.pl.in (original)
+++ trunk/orca/src/percol_running.pl.in Sat Jul 13 18:36:53 2002
@@ -1,6 +1,6 @@
# not_running: warn if percollator files are not up to date.
#
-# Copyright (C) 1998 Blair Zajac and GeoCities, Inc.
+# Copyright (C) 1998, 1999 Blair Zajac and GeoCities, Inc.
use strict;
use POSIX qw(strftime);
Modified: trunk/orca/src/percol_column.pl
==============================================================================
--- trunk/orca/src/percol_column.pl (original)
+++ trunk/orca/src/percol_column.pl Sat Jul 13 18:36:53 2002
@@ -1,6 +1,6 @@
# percol_column: display selected columns from percollator output.
#
-# Copyright (C) 1998 Blair Zajac and GeoCities, Inc.
+# Copyright (C) 1998, 1999 Blair Zajac and GeoCities, Inc.
use strict;
Modified: trunk/orca/src/orca.pl
==============================================================================
--- trunk/orca/src/orca.pl (original)
+++ trunk/orca/src/orca.pl Sat Jul 13 18:36:53 2002
@@ -15,9 +15,15 @@
$Data::Dumper::Purity = 1;
$Data::Dumper::Deepcopy = 1;
-# This is the version of this code.
+# This is the version of Orca.
use vars qw($VERSION);
-$VERSION = 0.16;
+$VERSION = 0.17;
+
+# This is the version number used in creating the DS names in RRDs.
+# This should be updated any time a new version of Orca needs some
+# new content in its RRD files.  The DS name is a concatenation of
+# the string Orca with this string of digits.
+my $ORCA_RRD_VERSION = 19990215;
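+# For example, with the value above each RRD's data source is named
+# Orca19990215, which is the name used in the DS: and DEF: strings
+# built further below.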
# The number of seconds in one day.
my $day_seconds = 24*60*60;
@@ -112,13 +118,13 @@
warn "$0: warning: cannot close `$self->{_filename}': $!\n";
}
-package OpenFileHash;
+package Orca::OpenFileHash;
use Carp;
sub new {
unless (@_ == 2) {
- confess "$0: OpenFileHash::new passed wrong number of arguments.\n";
+ confess "$0: Orca::OpenFileHash::new passed wrong number of arguments.\n";
}
my $class = shift;
@@ -134,7 +140,7 @@
sub open {
unless (@_ == 3) {
- confess "$0: OpenFileHash::open passed wrong number of arguments.\n";
+ confess "$0: Orca::OpenFileHash::open passed wrong number of arguments.\n";
}
my ($self, $filename, $weight) = @_;
@@ -324,7 +330,7 @@
# Set up a cache of 150 open file descriptors. This leaves 255-150-3 = 102
# file descriptors for other use in the program.
use vars qw($open_file_cache);
-$open_file_cache = OpenFileHash->new(150) unless $open_file_cache;
+$open_file_cache = Orca::OpenFileHash->new(150) unless $open_file_cache;
package Orca::DataFile;
@@ -426,7 +432,7 @@
package Orca::GIFFile;
-use RRDs 0.99.0;
+use RRDs 0.99011;
use Carp;
sub new {
@@ -609,7 +615,7 @@
for (my $i=0; $i<$data_sources; ++$i) {
my $rrd_key = $self->{_my_rrd_list}[$i];
my $rrd_filename = $self->{_all_rrd_ref}{$rrd_key}->filename;
- push(@options, "DEF:source$i=$rrd_filename:orca:AVERAGE");
+ push(@options, "DEF:source$i=$rrd_filename:Orca$ORCA_RRD_VERSION:AVERAGE");
}
for (my $i=0; $i<$data_sources; ++$i) {
my $legend = ::replace_group_name($plot_ref->{legend}[$i], $group);
@@ -621,7 +627,8 @@
my $legend = ::replace_group_name($plot_ref->{legend}[$i], $group);
$legend =~ s:%:\200:g;
$legend =~ s:\200:%%:g;
- push(@options, "GPRINT:source$i:AVERAGE:Average $legend is %f");
+ push(@options, "GPRINT:source$i:AVERAGE:Average $legend is %f",
+ "GPRINT:source$i:LAST:Current $legend is %f");
}
my $gif_filename = "$self->{_gif_basename}-$plot_type.gif";
@@ -643,7 +650,7 @@
if (open(META, "> $gif_filename.meta")) {
my $time =
print META "Expires: ",
- expire_string($plot_end_time + $plot_age + 30),
+ _expire_string($plot_end_time + $plot_age + 30),
"\n";
close(META) or
warn "$0: warning: cannot close `$gif_filename.meta': $!\n";
@@ -657,7 +664,7 @@
1;
}
-sub expire_string {
+sub _expire_string {
my @gmtime = gmtime(shift);
my ($wday) = ('Sun','Mon','Tue','Wed','Thu','Fri','Sat')[$gmtime[6]];
my ($month) = ('Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep',
@@ -717,23 +724,50 @@
$self->{_plot_ref} = $plot_ref;
$self->{_interval} = int($config_files->{$files_key}{interval}+0.5);
- # If the file exists, then get the time of the last data point entered,
- # otherwise set the last update time to -2. If the file doesn't exist,
- # it is created later when the data is first flushed to it.
+  # See if the RRD file meets two requirements.  The first is that
+  # the last update time can be successfully read.  The second is
+  # that the RRD has a DS named "Orca$ORCA_RRD_VERSION".  If either
+  # of these is not true, then a brand new RRD is created when data
+  # is first flushed to it.
$self->{_rrd_update_time} = -2;
if ($self->status >= 0) {
my $update_time = RRDs::last $rrd_filename;
if (my $error = RRDs::error) {
- warn "$0: RRDs::last error: $error\n";
+ warn "$0: RRDs::last error: `$rrd_filename' $error\n";
}
else {
- $self->{_rrd_update_time} = $update_time;
+ if (open(RRDFILE, "<$rrd_filename")) {
+ my $version = '';
+ while (<RRDFILE>) {
+ if (/Orca(\d{8})/) {
+ $version = $1;
+ last;
+ }
+ }
+ close(RRDFILE) or
+ warn "$0: error in closing `$rrd_filename' for reading: $!\n";
+
+      # Check the version number found in the file against the required version.
+ if ($version eq $ORCA_RRD_VERSION) {
+ $self->{_rrd_update_time} = $update_time;
+ }
+ elsif ($version) {
+ warn "$0: old version $version RRD `$rrd_filename' found: will create new version $ORCA_RRD_VERSION file.\n";
+ }
+ else {
+ warn "$0: unknown version RRD `$rrd_filename' found: will create new version $ORCA_RRD_VERSION file.\n";
+ }
+ }
}
}
$self;
}
+sub name {
+ $_[0]->{_name};
+}
+
sub rrd_update_time {
$_[0]->{_rrd_update_time};
}
@@ -795,7 +829,7 @@
# data source value is set to unknown.
my $interval = $self->{_interval};
- my $data_source = "DS:orca:$self->{_plot_ref}{data_type}";
+ my $data_source = "DS:Orca$ORCA_RRD_VERSION:$self->{_plot_ref}{data_type}";
$data_source .= sprintf ":%d:", 2*$interval;
$data_source .= "$self->{_plot_ref}{data_min}:";
$data_source .= "$self->{_plot_ref}{data_max}";
@@ -808,16 +842,18 @@
# RRA's with the same number of primary data points. This can happen
# if the interval is equal to one of the consolidated intervals.
my $count = int($rra_row_count[0]*300.0/$interval + 0.5);
- my $one_pdp_option = "RRA:AVERAGE:0.5:1:$count";
+ my @one_pdp_option = ("RRA:AVERAGE:0.5:1:$count",
+ "RRA:LAST:0.5:1:1");
for (my $i=1; $i<@rra_pdp_count; ++$i) {
next if $interval > 300*$rra_pdp_count[$i];
my $rra_pdp_count = int($rra_pdp_count[$i]*300.0/$interval + 0.5);
- if ($one_pdp_option and $rra_pdp_count != 1) {
- push(@options, $one_pdp_option);
+ if (@one_pdp_option and $rra_pdp_count != 1) {
+ push(@options, @one_pdp_option);
}
- $one_pdp_option = '';
- push(@options, "RRA:AVERAGE:0.5:$rra_pdp_count:$rra_row_count[$i]");
+ @one_pdp_option = ();
+ push(@options, "RRA:AVERAGE:0.5:$rra_pdp_count:$rra_row_count[$i]",
+ "RRA:LAST:0.5:$rra_pdp_count:1");
}
# Now do the actual creation.
@@ -831,7 +867,7 @@
RRDs::create @options;
if (my $error = RRDs::error) {
- warn "$0: RRDs::create error: $error\n";
+ warn "$0: RRDs::create error: `$rrd_filename' $error\n";
return;
}
}
@@ -886,7 +922,7 @@
$date_source,
$date_format,
$warn_email,
- $source_file_state) = @_;
+ $saved_source_file_state) = @_;
my $self = $class->SUPER::new($filename);
$self->{_interval} = $interval;
@@ -923,8 +959,8 @@
$self->{_read_interval} = int($read_interval + 0.5);
# Load in any state information for this file.
- if (defined $source_file_state->{$filename}) {
- while (my ($key, $value) = each %{$source_file_state->{$filename}}) {
+ if (defined $saved_source_file_state->{$filename}) {
+ while (my ($key, $value) = each %{$saved_source_file_state->{$filename}}) {
$self->{$key} = $value;
}
}
@@ -1354,7 +1390,7 @@
push(@{$config_plots->[$old_i]{creates}}, $gif);
}
- # Put into each RRD the GIFS that are generated from it.
+ # Put into each RRD the GIFs that are generated from it.
foreach my $rrd_key (@my_rrds) {
$rrd_data_files_ref->{$rrd_key}->add_gif($gif);
}
@@ -1443,8 +1479,18 @@
my $close_once_done = 0;
while (my $line = <$fd>) {
my @line = split(' ', $line);
+
+  # Skip this input line if 1) the file uses the first line to define
+  # the column names, and 2) the number of columns loaded is not equal
+  # to the number of columns in the column description.
+ if ($self->{_first_line} and @line != @{$self->{_column_description}}) {
+ warn "$0: number of columns in line $. of `$filename' does not match column description.\n";
+ next;
+ }
+
my $time = $use_file_mtime ? $self->file_mtime : $line[$date_column_index];
$last_data_time = $time if $time > $last_data_time;
+
# If the file status from the source data file is greater than zero, then
# it means the file has changed in some way, so we need to do updates for
# all plots. Load the available data and push it to the plots.
@@ -1595,7 +1641,7 @@
my $gif_files_ref = {list => [], hash => {}};
# Load the current state of the source data files.
- my $source_file_state = &load_state($config_options->{state_file});
+ my $saved_source_file_state = &load_state($config_options->{state_file});
# The first time through we always find new files. Determine the
# time interval that the current time is in, where the intervals are
@@ -1633,7 +1679,7 @@
$config_options,
$config_files,
$config_plots,
- $source_file_state,
+ $saved_source_file_state,
$old_found_files_ref,
$rrd_data_files_ref,
$gif_files_ref);
@@ -1641,16 +1687,12 @@
# Go through all of the groups and for each group and all of the files
# in the group find the next load time in the future.
undef %group_load_time;
- my $now = time;
foreach my $group (keys %$group_files_ref) {
my $group_load_time = 1e20;
foreach my $filename (@{$group_files_ref->{$group}}) {
my $load_time = $new_found_files_ref->{$filename}->next_load_time;
$group_load_time = $load_time if $load_time < $group_load_time;
}
- if ($group_load_time < $now) {
- die "$0: internal error: group_load_time less than current time.\n"
- }
$group_load_time{$group} = $group_load_time;
}
}
@@ -1715,14 +1757,14 @@
$updated_source_files = 1;
# Flush the data that has been loaded for each plot. To keep the
- # RRD that was just created in the systems cache, plot GIFs that
+ # RRD that was just created in the system's cache, plot GIFs that
# only depend on this RRD, since GIFs that depend upon two or more
# RRDs will most likely be generated more than once and the other
# required RRDs may not exist yet.
if ($opt_verbose) {
print "Flushing new data", $group ? " from $group" : "", ".\n";
}
- foreach my $rrd (values %this_group_rrds) {
+ foreach my $rrd (sort {$a->name cmp $b->name} values %this_group_rrds) {
$rrd->flush_data;
foreach my $gif ($rrd->created_gifs) {
next if $gif->rrds > 1;
@@ -1784,12 +1826,42 @@
}
}
+# Take a string and capitalize only the first character of the
+# string.
sub Capatialize {
my $string = shift;
substr($string, 0, 1) = uc(substr($string, 0, 1));
$string;
}
+# Sort group names depending upon the type of characters in the
+# group's name.
+sub sort_group_names {
+ my $a_name = ref($a) ? $a->group : $a;
+ my $b_name = ref($b) ? $b->group : $b;
+
+ # If both names are purely digits, then do a numeric comparison.
+  if ($a_name =~ /^[-]?\d+$/ and $b_name =~ /^[-]?\d+$/) {
+ return $a_name <=> $b_name;
+ }
+
+ # If the names are characters followed by digits, then compare the
+ # characters, and if they match, compare the digits.
+ my ($a_head, $a_digits, $b_head, $b_digits);
+ if (($a_head, $a_digits) = $a_name =~ /^([-a-zA-Z]+)(\d+)$/ and
+ ($b_head, $b_digits) = $b_name =~ /^([-a-zA-Z]+)(\d+)$/) {
+ my $return = $a_head cmp $b_head;
+ if ($return) {
+ return $return;
+ }
+ else {
+ return $a_digits <=> $b_digits;
+ }
+ }
+
+ $a_name cmp $b_name;
+}
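+
+# Illustrative example (these group names are hypothetical):
+#   sort sort_group_names qw(nfs13 nfs5 web10 web2)
+# returns ("nfs5", "nfs13", "web2", "web10"), whereas a plain string
+# sort would place nfs13 before nfs5.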
+
+# Create all of the different HTML files with all of the proper HREFs
# to the GIFs.
sub create_html_files {
@@ -1807,9 +1879,9 @@
# Create the main HTML index.html file.
my $index_html = Orca::HTMLFile->new($index_filename,
- $config_options->{html_top_title},
- $config_options->{html_page_header},
- $config_options->{html_page_footer});
+ $config_options->{html_top_title},
+ $config_options->{html_page_header},
+ $config_options->{html_page_footer});
unless ($index_html) {
warn "$0: warning: cannot open `$index_filename' for writing: $!\n";
return;
@@ -1830,7 +1902,7 @@
# Go through each group.
if (keys %$group_files_ref > 1) {
$index_html->print("<h2>Available Targets</h2>\n\n<table>\n");
- foreach my $group (sort keys %$group_files_ref) {
+ foreach my $group (sort sort_group_names keys %$group_files_ref) {
# Create the HTML code for the main index.html file.
my $group_basename = strip_key_name($group);
@@ -1857,9 +1929,9 @@
my $filename = "$html_dir/$href";
my $Plot_Type = Capatialize($plot_type);
my $fd = Orca::HTMLFile->new($filename,
- "$Plot_Type $group",
- $config_options->{html_page_header},
- $config_options->{html_page_footer});
+ "$Plot_Type $group",
+ $config_options->{html_page_header},
+ $config_options->{html_page_footer});
unless ($fd) {
warn "$0: warning: cannot open `$filename' for writing: $!\n";
next;
@@ -1975,9 +2047,12 @@
}
# Put together the correctly ordered list of GIFs using the array
- # references in the legends hash.
+ # references in the legends hash. Sort the GIFs using the
+ # special sorting routine for group names.
my @gifs;
foreach my $legend_no_group (sort keys %same_legends_gif_list) {
+ @{$same_legends_gif_list{$legend_no_group}} =
+ sort sort_group_names @{$same_legends_gif_list{$legend_no_group}};
push(@gifs, @{$same_legends_gif_list{$legend_no_group}});
}
@@ -2005,9 +2080,9 @@
my $filename = "$html_dir/$href";
my $Plot_Type = Capatialize($plot_type);
my $fd = Orca::HTMLFile->new($filename,
- "$Plot_Type $legend_no_group",
- $config_options->{html_page_header},
- "<hr>\n$config_options->{html_page_footer}");
+ "$Plot_Type $legend_no_group",
+ $config_options->{html_page_header},
+ "<hr>\n$config_options->{html_page_footer}");
unless ($fd) {
warn "$0: warning: cannot open `$filename' for writing: $!\n";
next;
@@ -2067,9 +2142,9 @@
$gif->group);
my $summarize_name = "$html_dir/$with_group_name.html";
my $summarize_html = Orca::HTMLFile->new($summarize_name,
- $legend_with_group,
- $config_options->{html_page_header},
- $config_options->{html_page_footer});
+ $legend_with_group,
+ $config_options->{html_page_header},
+ $config_options->{html_page_footer});
unless ($summarize_html) {
warn "$0: warning: cannot open `$summarize_name' for writing: $!\n";
next;
@@ -2232,7 +2307,7 @@
$config_options,
$config_files,
$config_plots,
- $source_file_state,
+ $saved_source_file_state,
$old_found_files_ref,
$rrd_data_files_ref,
$gif_files_ref) = @_;
@@ -2280,14 +2355,16 @@
$tmp_group_by_file{$filename} = $group;
}
- # Create a new list of filenames sorted by group name and inside each
- # group sorted by filename. This will cause the creates plots to
- # appear in group order.
+ # Create a new list of filenames sorted by group name and inside
+ # each group sorted by filename. This will cause the created
+ # plots to appear in group order.
@filenames = ();
foreach my $key (sort keys %tmp_files_by_group) {
push(@filenames, sort @{$tmp_files_by_group{$key}});
}
+ # Now for each file, create the Orca::SourceDataFile object that
+ # manages that file and the GIFs that get generated from the file.
foreach my $filename (@filenames) {
# Create the object that contains this file. Take care if the same
# file is being used in another files group.
@@ -2306,7 +2383,7 @@
$config_files->{$files_key}{date_source},
$config_files->{$files_key}{date_format},
$config_options->{warn_email},
- $source_file_state);
+ $saved_source_file_state);
unless ($data_file) {
warn "$0: warning: cannot process `$filename'.\n";
next;
Modified: trunk/orca/CHANGES
==============================================================================
--- trunk/orca/CHANGES (original)
+++ trunk/orca/CHANGES Sat Jul 13 18:36:53 2002
@@ -1,10 +1,78 @@
+Tue Feb 16 10:49:07 PST 1999
+
+ Version 0.17.
+
+ Sort group names depending upon the format of the group names.
+ This now allows nfs13 to be listed after nfs5, even though cmp
+ will list nfs13 before nfs5.
+
+ Created GIFs now also print the current measured value in addition
+ to the average measured value.
+
+  Update the http and inode-with-page-steals plots in
+  percollator.cfg.in to have shorter legends so that RRDtool can
+  plot the whole comment along with the average and last value in
+  the GIF.
+
+Mon Feb 15 12:29:18 PST 1999
+
+ Require RRDtool 0.99.11.
+
+  Add version information to the DS names created by Orca so that
+  new versions of Orca can make older RRD files obsolete.  This
+  feature is now used to make sure LAST measurements are stored in
+  Orca's RRDs.
+
+Fri Feb 12 15:10:05 PST 1999
+
+ Require RRDtool 0.99.10.
+
+ When creating a brand new RRD, create new RRAs for the last
+ value measured using LAST.
+
+Thu Feb 11 11:48:14 PST 1999
+
+ Require RRDtool 0.99.8.
+
+Thu Feb 4 12:51:54 PST 1999
+
+ Update the installation instructions for SE 3.1 beta.
+
+Fri Jan 29 15:38:53 PST 1999
+
+ Require RRDtool 0.99.6.
+
+Thu Jan 28 15:44:07 PST 1999
+
+  If a file uses the first line as a column description and a
+  subsequent line does not have the same number of columns, then
+  warn and skip that line.
+
+ Require RRDtool 0.99.4.
+
+Tue Jan 26 15:28:26 PST 1999
+
+ Rename Orca::GIFFile::expire_string to
+ Orca::GIFFile::_expire_string.
+
+ Change the default location of the percollator RRDs to be
+ $prefix/var/orca/rrd/percollator instead of $prefix/var/orca/rrd.
+
+ Sort the order of RRD files being updated by the RRD's filename.
+
+  Do not make the RRD_DIR/percollator directory at install time.
+
+ Require RRDtool 0.99.2.
+
+ Remove sanity check on group update times that could cause
+ false dies.
+
Tue Jan 26 10:07:40 PST 1999
Fix a bug in lib/Makefile.in where the variable $(CP) was used
but never defined.
- Fix a bug where watch_data_sources() would do an extra loop
- each time file updates were looked for.
+ Fix a bug where watch_data_sources() would do an extra loop each
+ time file updates were looked for.
Version 0.16.
@@ -16,16 +84,16 @@
Update the URL to Tobias Oetiker's RRD web site.
- Make sure lib/Makefile does not overwrite old lib/percollator.cfg's
- when a make install is done.
+ Make sure lib/Makefile does not overwrite old
+ lib/percollator.cfg's when a make install is done.
Version 0.15.
Fri Jan 22 20:17:56 PST 1999
- Create an Orca logo and have the logo appear at the bottom
- of each HTML file. The logo is stored as hexidecimal inside
- the Orca perl script.
+ Create an Orca logo and have the logo appear at the bottom of
+ each HTML file. The logo is stored as hexadecimal inside the
+ Orca perl script.
Update percollator.cfg.in to not contain bzajac at geostaff.com
any more. Changed it to root at localhost.
@@ -35,9 +103,9 @@
Add a configure script to Orca to make installation of Orca and
percollator much easier.
- Add a new file env_percol to set up all the environmental variables
- so that percollator runs correctly instead of having to edit
- start_percol, stop_percol, and restart_percol.
+ Add a new file env_percol to set up all the environmental
+ variables so that percollator runs correctly instead of having
+ to edit start_percol, stop_percol, and restart_percol.
Mon Jan 11 16:58:28 PST 1999