Project Management Capability Levels: An Empirical Study

Tom McBride, University of Technology, Sydney, mcbride@it.uts.edu.au
Brian Henderson-Sellers, University of Technology, Sydney, brian@it.uts.edu.au
Didar Zowghi, University of Technology, Sydney, didar@it.uts.edu.au

Abstract

This paper outlines existing maturity models of project management and their underlying constructs. Organizations involved in software development in Sydney, Australia were interviewed about their project management practices, and their responses were analysed to determine whether different project managers used different levels of project management practices and whether those practices were in accordance with a process based maturity model. This did not seem to be the case, yet the data suggested that, as a possible alternative, a systems theory based approach might be more tenable. The overall conclusion, that a systems theory based maturity model appears to be better correlated with organizational size and software development maturity than a process based maturity model, is briefly discussed, and additional research is suggested that could investigate this novel conclusion further.

1 Introduction

Maturity models of software development processes are useful because they indicate different levels of performance (of the processes) and hence the direction for software process improvement (SPI). The best known of them, the CMM, was developed in response to a request to provide the USA federal government with a method for assessing its software contractors [8]. However, maturity models such as those that underlie the CMM's successor, CMMI (Integrated Capability Maturity Model) [9], or SPICE (Software Process Improvement and Capability dEtermination) [6] may not be the best models of maturity for management processes such as project management, as opposed to the technical processes of developing the software, nor are they necessarily the best model for distributed, globalised software development. Rather than simply seeking evidence of
conformance to a particular model of maturity or project management, this research first seeks to establish the work practices of a sample of project managers, and then to deduce whether those work practices conform to a pattern of increasing sophistication or maturity.

As noted above, maturity scales require an underlying model. These are generally devised according to a view of how the desired results ought to be achieved. Depending upon which model is used in the assessment, an organization might rate higher or lower. For example, if multi-lingual skills were thought essential to international business, then organizations whose personnel possessed those skills would be likely to be assessed at a higher maturity level than if the underlying model did not consider multi-lingual skills to be important.

Well-known maturity models (outlined in Section 2) such as CMMI and SPICE have an underlying process model that views software development activities in an industrial production-like fashion, focusing attention on the flow of work from one process to another. Alternative views, such as systems theory [1, 10, 12], focus attention on different aspects of software development, project management in particular (Section 5).

To investigate whether the monitoring and management activities of project managers conform to a capability scale, the following research question was proposed: Do the monitoring and controlling activities of project managers conform to a process based capability model such as SPICE? Evidence was gathered (Section 3) to establish how project managers monitor and control their software development projects, and the responses were then examined against the hypothesis. It was found that, although there was some support for a maturity model of project management, the support was not for the expected process based model (as is assumed for CMMI and SPICE) but for a systems theory based model (Section 4). Threats to validity are considered in
Proceedings of the 11th Asia-Pacific Software Engineering Conference (APSEC’04) 1530-1362/04 $ 20.00 IEEE
Section 5, with Section 6 offering conclusions and pointers to future research.
2 Maturity Models
Process maturity models generally describe a collection of processes relevant to the area of interest and a scale by which increasing maturity may be assessed. Maturity is associated with organizations whose processes are capable of producing better outcomes; the relevant practices were originally observed in the more "mature" organizations. Capability is assessed as giving some indication of such maturity and is generally evaluated relative to efficient performance, i.e. the shortest completion time for the nominated requirements with the fewest defects. The assessment is of the organization's ability to perform the processes rather than of any characteristics of the processes themselves, i.e. the measure is not of the quality of the process as formulated in the "process handbook" but of the enactment of that process. A universal scale of capability has yet to be established.

The original SEI Software Capability Maturity Model (SW-CMM) was based on principles that have existed "for the last sixty years" [8]. The maturity framework was originally described by Crosby [3] as having five evolutionary stages in the adoption of quality practices; this framework was later adapted to software processes. The SPICE capability scale reflects much of the theory of Total Quality Management (TQM), with its orientation toward statistical process control. The capability model proposed by Ibbs and Kwak [5] uses statistical relationships between project management maturity and project performance. While each of these takes a different view of the details of the capability scale, there is broad consensus that at higher capability levels the processes are performed with greater rigour. The best known capability models in software engineering are the Integrated Capability Maturity Model (CMMI) [9] and the ISO originated Software Process Improvement and Capability dEtermination model (SPICE) [6].
Other capability models have also been developed but are outside the scope of the present paper. The dominant text for project management, the PMBOK [4], separates the whole of project management into nine knowledge areas. Activities from each knowledge area are performed as required at various times during a project, across the phases of initiating, planning, executing, controlling and closing, which reflect the familiar sequential arrangement. However, the knowledge areas are not presented as processes and do not have the same production orientation as the SPICE and CMMI process models. Ibbs and Kwak [5], for example, developed their maturity model of project management to better understand the financial and organizational benefits of using project management tools and practices in organizations. Rather than being confined to software development, their model was developed from information gathered from a range of industries. Increasing levels of maturity appear to be based on performing key activities with greater thoroughness and rigour as the maturity levels increase. This is in contrast to both CMMI and SPICE, where increasing levels of capability are achieved through performing more, and different, tasks.
3 Research method
To investigate whether project managers' activities conform to a process based capability model, structured interviews were conducted with project managers from a number of software development organizations in Sydney, Australia between February and September 2003. Organizations were approached by phone initially and asked if there was a project manager involved in software development who was willing to be interviewed. Structured interviews allowed questions and responses to be clarified or amplified during the interview and also allowed unexpected information and findings to emerge rather than directing responses to preconceived models.

There were 49 questions asked in the structured interviews. Of these:
• 4 questions categorized the organization and its software development processes,
• 7 questions established how the project manager monitored the project,
• 3 questions established how the organization adjusted the project (scope, schedule, quality requirements, performance requirements) as a consequence of monitoring the project,
• 8 questions established an approximate measure of organizational distance, defined as the administrative, geographical and cultural separation between the sections of the project team,
• 5 questions established project monitoring processes for outsourced tasks, and
• 2 questions established how the outsourced project tasks were managed in response to information revealed by project monitoring.

An expected range of responses was developed for each of the 49 questions, both to guide the questioning and responses, and to help guide later analysis. This range of responses was not shown to the interviewee but used to indicate the scope of the information sought. For example, one of the questions and its expected responses was:

Is there a standard method or process for monitoring project tasks?
• No – each project manager does their own thing.
• Yes, but informal and flexible.
• Yes, defined but not very extensive.
• Yes, defined and extensive.

This reduced the tendency to answer in more detail than was intended. Questions were generally of two types. The first was intended to establish the organization's position on some scale. For example, a question about the size of the organization was intended to establish whether it was small, medium, large or multinational. Similarly, a question on the formality of their software development processes was intended to establish approximately where the organization would likely be placed by a CMMI or SPICE assessment of process maturity. The second type of question was more open and designed to elicit information on, for example, the range of subjects discussed at an internal project meeting. In a survey, this type of question would usually have a space to respond with "Other" so that respondents could expand on any other relevant issues.

Each interview took between 30 minutes and just over an hour, depending on the loquaciousness of the interviewee. Most lasted about 45 minutes and were conducted at the interviewee's worksite. Interviews were audio taped and later transcribed, and the transcript was sent back to the interviewee to check and correct. The interview responses were then encoded and analysed using the statistical package SPSS 11.0.
Since the encoded variables were nominal or ordinal, the appropriate statistical tests were the Chi-square test where both variables were nominal, and Kendall's tau-b where one of the variables was ordinal. Organizations face a number of demands on their time and resources, and academic research, no matter how well intended or potentially beneficial, must compete for the organization's time and willingness. To help persuade project managers to participate in this research, an offer was made to email the list of questions to them prior to the interview and to send a report of the findings once the study was complete.
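The choice of test described above can be illustrated in code. The sketch below (in Python, using hypothetical toy data rather than the study's actual responses) computes a Pearson Chi-square statistic for two nominal variables, and Kendall's tau-b with tie correction for ordinal ones:

```python
from itertools import combinations
from math import sqrt

def chi_square_stat(table):
    """Pearson chi-square statistic and degrees of freedom for a
    contingency table of observed counts (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    df = (len(row_totals) - 1) * (len(col_totals) - 1)
    return stat, df

def kendall_tau_b(x, y):
    """Kendall's tau-b rank correlation, correcting for ties."""
    concordant = discordant = ties_x_only = ties_y_only = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        dx, dy = x1 - x2, y1 - y2
        if dx == 0 and dy == 0:
            continue                  # tied in both variables
        elif dx == 0:
            ties_x_only += 1
        elif dy == 0:
            ties_y_only += 1
        elif dx * dy > 0:
            concordant += 1
        else:
            discordant += 1
    denom = sqrt((concordant + discordant + ties_y_only) *
                 (concordant + discordant + ties_x_only))
    return (concordant - discordant) / denom

# Toy example: a 2x2 table of nominal counts and two short ordinal scales.
stat, df = chi_square_stat([[10, 20], [20, 10]])
tau = kendall_tau_b([1, 2, 2, 3], [1, 2, 3, 3])
```

In practice the statistic would be converted to an asymptotic significance against the chi-square distribution with df degrees of freedom, as SPSS does.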
3.1 Sample characteristics
Organizational size. Organizational size was judged largely on the number of personnel. This estimate included the whole organization, not just the software development part of it, because past experience indicates that a small division within a large organization more closely resembles the large organization than a small, independent company of similar size to the division. Table 1 gives the distribution of organization size. The size divisions were chosen because they reflect approximately where organizations tend to change structure, from direct supervision through simple, single layer management to multi layer management.

Table 1: Organizational size

  Small (< 30 staff)                         12
  Medium (31 - 120)                           4
  Large (121 - 1000, single organization)     3
  Multinational (> 1000 or multinational)    12
  Total                                      31
Process maturity. The process maturity rating is a very approximate guide based on the ISO 15504 (SPICE) or CMMI scale of process maturity. The first author is familiar with, and practised at, such process assessments; these ratings would be the equivalent of a very low rigour SPICE assessment. The single instance of a maturity level of 5 (Table 2) came from an actual CMMI assessment. Organizations were adjudged to be at level 3 if they were ISO 9001 accredited or had undergone a SPICE or CMMI assessment and achieved that rating. Level 2 was assigned if the organization had documented software development processes, particularly those dealing with project management and document control.

Table 2: Process maturity

  Informal - Level 1       6
  Managed - Level 2        8
  Defined - Level 3       16
  Measured - Level 4       0
  Optimizing - Level 5     1
  Total                   31
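The rating rules just described amount to a small decision procedure. A minimal sketch in Python (the dictionary keys are hypothetical names invented for illustration; the rules themselves are those stated above):

```python
def approximate_maturity(org):
    """Very rough SPICE/CMMI-style maturity rating following the
    categorisation rules described above. `org` is a dict with
    hypothetical keys; absent keys default to the informal case."""
    if org.get("assessed_level"):          # an actual SPICE/CMMI assessment result
        return org["assessed_level"]
    if org.get("iso9001_accredited"):      # ISO 9001 accreditation treated as level 3
        return 3
    if org.get("documented_processes"):    # documented PM and document control
        return 2
    return 1                               # otherwise informal

level = approximate_maturity({"iso9001_accredited": True})
```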
3.2 Project Control Techniques
Project managers were asked if they would drop functionality, or move it to a later release, or if they would compromise quality goals or performance goals in order to meet the delivery schedule. The responses are summarized in Tables 3 to 5. One of the more interesting and revealing responses on this topic was that, although the quality of the actual delivered executable system would not be compromised, there was always the capacity to reduce the quality of associated documentation, either internally in the code or externally in reference manuals or technical manuals. Similarly, time can be saved by reducing the rigour of code or documentation reviews.

Table 3: Is functionality dropped to meet the delivery schedule?

                                Frequency  Percent
  Always retained                   5        16.1
  Engineers decide                  1         3.2
  Project Review board decides      3         9.7
  Marketing decides                 2         6.5
  Negotiated with stakeholders     19        61.3
  No response                       1         3.2
  Total                            31       100
Table 4: Are quality goals compromised to meet the delivery schedule?

                                Frequency  Percent
  Always retained                  12        38.7
  Engineers decide                  1         3.2
  Project Review board decides      6        19.4
  Marketing decides                 2         6.5
  Negotiated with stakeholders      9        29.0
  No response                       1         3.2
  Total                            31       100

Table 5: Are performance goals compromised to meet the delivery schedule?

                                Frequency  Percent
  No performance goals set          5        16.1
  Always retained                   9        29.0
  Engineers decide                  2         6.5
  Project Review board decides      1         3.2
  Marketing decides                 2         6.5
  Negotiated with stakeholders     10        32.3
  No response                       2         6.5
  Total                            31       100

3.3 Project monitoring

Project managers were asked how they monitored the project. In fact they were each asked "How do you know if the project is going well?" and "How do you know if the project is going badly?" The responses were grouped into the common techniques of:
• expert judgment
• progress measures
• earned value measures
• risk monitoring
• defect monitoring
• test results
• other.

There was no statistical relationship between any one of the monitoring techniques and either of the two variables "organization size" and "process maturity". However, it seemed, on first consideration, that project managers in the more mature organizations used several monitoring techniques, as shown in Table 6. Pearson's Chi-square test returned an asymptotic significance of 0.054, leading to the conclusion that there was a weak correlation between organizational process maturity and the number of monitoring techniques used by project managers.

Table 6: Count of monitoring techniques used (1 to 7) vs. process maturity (row totals: Informal 6, Managed 8, Defined 16, Optimizing 1; overall total 31)

3.4 Higher levels of project management capability

Respondents were asked questions that would give a very approximate indication of higher levels of project management process capability on a SPICE or CMMI scale.
Defined process. A question was directed at establishing whether or not the organization had a defined project management process. The existence and use of a defined process is one of the outcomes required for a SPICE level 3 process: with it, an organization can achieve level 3; without it, it cannot. The results for organization size vs. defined project management process are shown in Table 7.

Table 7: Organization size vs. Defined PM process

  Project monitoring    No       Informal  Formal, not  Formal,    Total
  process:              process  method    extensive    extensive
  Small                    7        3          2            0        12
  Medium                   2        2          0            0         4
  Large                    2        1          0            0         3
  Multinational            1        2          3            6        12
  Total                   12        8          5            6        31

A Pearson Chi-square test shows a correlation between Organization size and Defined PM process (asymptotic significance of 0.045) but not between Process Maturity and Defined PM process (asymptotic significance 0.232).

Measured process. SPICE level 4 capability concerns the degree to which objective measures are used to control the process performance and, specifically, to detect the sources of process faults and inefficiencies. As an example of a process measure applicable to project management, respondents were asked if they recorded how much time they spent on different aspects of project management. The responses are shown in Table 8; there are only 30 data points in this table because one organization gave no response.

Table 8: Organizational Maturity vs. PM Time Monitoring

  PM Time Monitoring:   No  Broad  Detailed  Total
  Informal               6    0       0        6
  Managed                7    0       1        8
  Defined                8    5       2       15
  Optimizing             0    0       1        1
  Total                 21    5       4       30

A Pearson Chi-square test shows no correlation between Organizational size and PM Time Monitoring (asymptotic significance 0.452) but shows a correlation between Organizational Maturity and PM Time Monitoring (asymptotic significance 0.040).

Optimizing process. SPICE level 5 is measured by the degree to which an organization anticipates process improvements due to changes in techniques and technology appropriate to the process and deploys selected improvements throughout the organization. As an indication of this, respondents were asked how they became better at project management, e.g. whether this was achieved mainly through training or through some other means such as a formal process improvement initiative. The results are shown in Table 9 and Table 10; there are only 29 data points in these two tables because two organizations gave no response.

Table 9: Organizational size vs. PM Process Improvement

  PM Process            Nothing   Training  Other  Total
  Improvement:          special
  Small                    7         5        0      12
  Medium                   1         1        2       4
  Large                    2         1        0       3
  Multinational            2         5        3      10
  Total                   12        12        5      29

Table 10: Organizational Maturity vs. PM Process Improvement ("Nothing special" column: Informal 3, Managed 5, Defined 3, Optimizing 1, total 12; the Training and Other columns account for the remainder of the 29 responses)

Neither Organizational size nor Organizational Maturity was correlated with Project Management Process Improvement (Pearson Chi-square asymptotic significance of 0.408 and 0.216 respectively).
3.5 Process based capability model

As can be seen from the statistical analyses of the data shown in the tables in Sections 3.3 and 3.4, there is very little evidence that project management activities conform to a process based capability maturity model such as SPICE or CMMI.
4 Systems theory capability model
As a consequence of the observed lack of support for a process based capability model, systems theory was reviewed and a capability model was developed based on systems theory. The interview data were re-examined for evidence of constraint, feedback and project-directing activities. It is acknowledged that such post-interview hypothesizing is only weakly valid and any results must be treated as indications of a relationship that requires more rigorous data before any support for such a relationship can be claimed.

Systems theory is founded on two pairs of ideas: emergence and hierarchy, and communication and control [2]. Systems may be decomposed into a hierarchy of sub-systems, each more complex than the one below it in the hierarchy. Each level of the hierarchy is characterized by emergent properties that do not exist at the lower levels. Leveson [7] gives the example of the shape of an apple, which, "although eventually explainable in terms of the cells of an apple, [the shape] has no meaning at that lower level of description." The operation of the processes at the lower level, that of the biology of the apple, "result in a higher level of complexity – that of the whole apple itself – that has emergent properties, one of which is the apple's shape."

Software development, viewed from the perspective of systems theory, places the project manager between the organization's executive and the development team in the organizational hierarchy. The project manager places constraints on the development activities and on the developed product, controls the development activities and receives feedback about the development as well as about emergent phenomena such as, among other things, coordination between the various development activities. Determining whether a project is coordinated cannot be done by examining any one of the project's activities; it requires examining the relationships between the activities, within the constraints imposed by the project management level, before the degree to which the project is coordinated can be determined.
Budget, available personnel and their expertise, delivery schedules, available tools and technologies, and quality requirements are all examples of development constraints. Some would be set by the level in the hierarchy above that of the project manager, i.e., the organization's executive; other constraints would emerge as part of requirements elicitation. Still others, such as the organizational culture or the development team's professional culture, are typically an assumed part of the work environment. Obviously, there is the possibility that some constraints contradict each other, and the project manager needs to decide how to resolve such contradictions. In other cases, the constraint will be a soft constraint, such as the increased error rate in developed work as daily work hours increase, rather than a hard constraint such as the available budget or the development team's expertise.

The project manager, among other things, directs development by assigning personnel, scheduling the work and communicating information to where it is needed. Feedback would be sought from a range of sources for a range of purposes. Obviously, the project manager needs to know how the project is progressing, but also needs feedback on whether the development team's expertise is sufficient to complete the development and whether the customer perceives that their concerns are being heeded, as well as on a whole range of similar issues.

The project manager's actions are unlikely to be uniform in response to the project characteristics and are, instead, more likely to exhibit some form of capability scale. Some of the capability is likely to be attributable to the demands of the project and some to the project manager's individual skill. There is a range of alternative capability scales. One is that the capabilities will conform to the CMMI and SPICE scale, based on increasing management control and process repeatability.
Another possible capability scale is that key activities will be performed with increasing complexity or increasing rigour, as demonstrated in Ibbs and Kwak's project management capability model [5]. Within the context of systems theory, we believe activities are likely to be grouped into those concerning constraints, those concerning feedback and those concerning directing the work. If the systems model were adopted, then project management would improve, or become more capable, as a result of the project manager's endeavours as he/she:
• Established and incorporated the project's constraints into the project strategy and subsequent plan during project planning.
• Modified constraints during project planning to better achieve the project outcomes.
• Established sources of feedback.
• Established the type of feedback: subjective, objective, political, etc.
• Monitored the feedback during the project.
• Monitored the constraints during the project.
• Modified constraints during the project.
• Established and practised control action.
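The activities above describe a monitoring-and-control loop. A minimal sketch of one cycle of that loop under the systems theory view (all names and numeric constraints here are illustrative assumptions, not data from the study):

```python
from dataclasses import dataclass, field

@dataclass
class ProjectSystem:
    """A project seen as a system: constraints imposed from the level
    above, feedback flowing up from the development activities."""
    constraints: dict                       # e.g. limits set by the executive
    feedback: dict = field(default_factory=dict)

def control_step(project, observations):
    """One monitoring/controlling cycle: absorb feedback, compare it
    against each constraint, and emit a control action for any
    constraint that the feedback shows to be violated."""
    project.feedback.update(observations)
    actions = []
    for name, limit in project.constraints.items():
        value = project.feedback.get(name)
        if value is not None and value > limit:
            actions.append(f"renegotiate {name}")
    return actions

project = ProjectSystem(constraints={"budget": 100, "schedule_weeks": 12})
actions = control_step(project, {"budget": 120, "schedule_weeks": 10})
```

Here the loop would run repeatedly over the life of the project, with constraints themselves open to modification between cycles, matching the bulleted activities.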
4.1 Principles of capability
A general principle was proposed that greater capability would be associated with greater awareness and greater complexity. A project manager who actively sought out and tested a number of project constraints would be judged more capable than a project manager who was unaware of the project's constraints. Seeking multiple sources and different types of feedback would demonstrate greater capability than seeking a minimal number of feedback sources. Using such principles, a scale of capability emerged from the data analysis. An example scale for one of the project management activities is given in Table 11. Similar capability scales were developed for each of the attributes.
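The principle of "greater awareness, greater capability" can be made concrete as a rubric. The sketch below maps the set of constraint categories a project manager investigates to the 1 to 5 scale of Table 11; the category names follow the table, but the decision rules are a simplified illustrative reading of it, not a definitive operationalisation:

```python
PRODUCT = {"requirements", "budget", "schedule"}
PERSONNEL = {"expertise", "teamwork", "social"}
BROADER = {"political", "infrastructure", "technology"}

def constraint_capability_level(investigated):
    """Assign a 1-5 capability level from the constraint categories
    a project manager actively investigates (see Table 11)."""
    s = set(investigated)
    if not s:
        return 1    # no interest: accepts the given constraints
    if s & BROADER and s >= (PRODUCT | PERSONNEL):
        return 5    # knowledgeable investigation across all areas
    if s & BROADER:
        return 4    # limited look beyond product and personnel
    if s & PERSONNEL:
        return 3    # product related plus some personnel related
    return 2        # some investigation of product constraints only
```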
4.2 Construct validity
The interview data were re-examined to assign a capability level to each project manager for each of the project management activities described above. All items in the capability model appear to be highly correlated.
4.3 Correlations
For the purposes of analysis, activities were separated into those which would be performed during project planning and those which would be performed while the project was in progress. The measures of each activity were then correlated against organization size, process maturity and project size. Both organization size and process maturity were highly correlated with all capability attributes, but project size was only weakly correlated with the capability attributes, as shown in Table 12.

Table 11: Example of capability attribute – Modify constraints during planning

Determine if some constraints can or must be changed:
• Product related: requirements, budget, schedule
• Personnel related: expertise, teamwork, social issues
• Infrastructure: tools, logistics

  Scale  Indications
  1      No interest. Accept the given constraints.
  2      Some investigation into product related constraints.
  3      Product related and some personnel related. Actively investigate constraints in limited areas.
  4      Limited investigation into constraints other than product and personnel.
  5      Knowledgeable, investigating all constraints: product, personnel, political, infrastructure, technology.

Table 12: Kendall's tau-b correlations

                         Org size  Process   Project
                                   maturity  size
  Establish constraints   .614**    .624**    .339*
  Alter constraints       .563**    .541**    .409**
  Sources of feedback     .507**    .488**    .277
  Type of feedback        .567**    .559**    .240
  Monitor feedback        .462**    .546**    .358*
  Monitor constraints     .623**    .689**    .477**
  Modify constraints      .525**    .528**    .315
  Controlling actions     .682**    .670**    .409**

  * Correlation is significant at the .05 level (2-tailed).
  ** Correlation is significant at the .01 level (2-tailed).

It is readily evident that there is strong correlation between organization size and systems theory based project management capability, and between organizational process maturity and systems theory based project management capability, but weaker evidence of a correlation between project size and a systems theory based project management capability.
5 Threats to validity
Small sample size. The sample was relatively small at 29 and many statistical tests suffered from having insufficient cell counts, usually less than 10. Non random sample. The participating organizations were those listed in the Sydney,
Australia, Yellow Pages who agreed to be interviewed when approached by telephone. Such accidental sampling is considered to have very weak external validity and is likely to be biased [11].

Weak external validity. Organizations with low maturity and chaotic project management processes are less likely to be willing to reveal to a researcher just how they manage projects – or, rather, don't manage them. Consequently the findings of this research are likely to be biased toward the more mature organizations. However, given the conclusions, the weak external validity is of less importance.

Localized sample. The research sample was taken from organizations in Sydney, Australia. While there were a significant number of multinational organizations in the sample, it is possible that the research findings are similarly localized. The study would need to be replicated in another country to test this.

Post analysis hypothesis. Hypothesizing that systems theory may be applicable as the basis for a model of project management capability occurred only after the interview data had been gathered and analysed. Any conclusion from the subsequent analysis must be regarded as a possible indication of some relationship rather than proof of such a relationship. It could form the basis of a future study to explore the finding in greater depth and rigour.
6 Conclusion and further research
Capability models have been used very successfully to guide process improvement and to provide indications of an organization's maturity, but their use of the same set of higher capability level activities for all processes has not previously been tested. While the project management practices observed in this study showed no correlation with a process based capability model such as SPICE, a systems theory based capability model appears to correlate well with the kinds of activities performed by software development project managers. The research strongly suggests that a systems theory view of project management is more likely to accurately reflect what software development project managers actually do to monitor and control a project, and to provide a strong capability scale for those activities. Indeed, a capability scale based on the increasing scope and complexity of the performed activities appears to apply to the activities actually performed by project managers, and such a scale is also highly correlated with both organization size and organizational process maturity. Since these conclusions suffer from weak validity, both internal and external, they need to be explored and validated by further research specifically directed at the correlation between a systems theory model of project management capability and the practices actually performed by project managers.
7 Acknowledgments
This is Contribution number 04/11 of the Centre for Object Technology Applications and Research (COTAR).
8 References
[1] P. Checkland, "Systems Thinking and Management Thinking," American Behavioral Scientist, vol. 38, pp. 75(17), 1994.
[2] P. Checkland, Systems Thinking, Systems Practice. Chichester: John Wiley & Sons, 1981.
[3] P. B. Crosby, Quality is Free: The Art of Making Quality Certain. New York: McGraw-Hill, 1979.
[4] W. R. Duncan, A Guide to the Project Management Body of Knowledge. Project Management Institute, 1996.
[5] C. W. Ibbs and Y. H. Kwak, "Assessing Project Management Maturity," Project Management Journal, vol. 31, pp. 32-43, 2000.
[6] ISO/IEC TR 15504:1998, Information Technology - Software Process Assessment.
[7] N. Leveson, "A New Accident Model for Engineering Safer Systems," presented at the MIT Engineering Systems Division Internal Symposium, Boston, MA: MIT, 2002.
[8] SEI, "Capability Maturity Model for Software (Version 1.1)," Software Engineering Institute, Pittsburgh, CMU/SEI-93-TR-024, 1993.
[9] SEI, "CMMI for Systems Engineering/Software Engineering, Version 1.02," Carnegie Mellon University/Software Engineering Institute, Pittsburgh, CMU/SEI-2000-TR-019, 2000.
[10] L. Skyttner, General Systems Theory: Ideas & Applications. Singapore: World Scientific, 2001.
[11] W. M. K. Trochim, The Research Methods Knowledge Base. Cincinnati: Atomic Dog Publishing, 2001.
[12] G. M. Weinberg, An Introduction to General Systems Thinking. New York: Dorset House, 2001.
Proceedings of the 11th Asia-Pacific Software Engineering Conference (APSEC 2004), 30 November - 3 December 2004, Haeundae Grand Hotel, Busan, Korea. Sponsored by the Korea Information Science Society. IEEE Computer Society, ISBN 0-7695-2245-9, ISSN 1530-1362.