IMPROVING ARMY MULTINATIONAL FORCE COMPATIBILITY

by

Krista M. Owen-Magras

A Policy Project submitted in partial fulfillment of the requirements for the degree of

Master of Public Policy (MPP)
University of California, Los Angeles
June 2000
Approved by ___________________________________________________
Policy Project Advisor

Program Authorized to Offer Degree _________________________________________________
Date: March 23, 2000
Executive Summary

IMPROVING ARMY MULTINATIONAL FORCE COMPATIBILITY
By Krista M. Owen-Magras

Over the past decade, the United States Army's involvement in operations requiring close participation with partner country militaries increased significantly. Multiple missions, including Operation Desert Storm in the Persian Gulf and, most recently, Operations Joint Endeavor and Joint Guardian in Bosnia and Kosovo, have continually tested the interoperability of the participating nations. The lessons drawn from these operations demonstrate the need to design methods for enhancing compatibility between nations across the spectrum of mission types and requirements, from combat operations to peacekeeping and peace enforcement.

The object of this study is to improve the Army's efforts (as guided by the Deputy Under Secretary of the Army for International Affairs, DUSA-IA) at enhancing other militaries' abilities to function as effective coalition partners (Multinational Force Compatibility, MFC) over the long term. Rather than designing workarounds, this study provides a method for identifying compatibility deficiencies well in advance of any possible operation, so the partner militaries, with the aid of the US, can "fix" the deficiencies.

This project focuses specifically on the development of a method, involving both qualitative and quantitative analysis, to assist in identifying deficiencies in multinational force compatibility with possible partner countries, across a range of missions and functional areas. The decision support tool developed in this policy analysis is entitled the Military Compatibility Assessment Tool (MCAT). It is designed to assist DUSA-IA in making strategic-level multinational force compatibility policy decisions. The tool specifically allows for the identification of a country's compatibility strengths and deficiencies in critical mission areas.
The MCAT produces a hierarchy of states based on their ground forces' compatibility with the US Army and identifies areas of greatest need and potential bottlenecks in coalition operations with the US. It establishes the country ranking in a relative fashion, allowing for comparisons among the states using the US Army as the standard point of reference.

The MCAT allows DUSA-IA to improve MFC within the current system constraints by giving decision-makers a rational basis on which to make good policy choices. It allows the Army to identify which activities, conducted with which partners, will be the most effective in enhancing MFC. Beyond providing a general analysis of the current worldwide situation, the analysis can be tailored to fit any region or mission type so as to provide specific input for contingency planning. Most importantly, the MCAT can assist the Army with developing a guiding strategic policy for MFC.

Based on my analysis, I recommend that:
• DUSA-IA clearly detail explicit long-term Army MFC priorities. This is a critical function of the office, and should be the first step in its efforts at enhancing MFC.

• DUSA-IA and the Army adopt the MCAT methodology as part of the decision support system used to make decisions about MFC priorities, activities and funding.

• DUSA-IA organize a panel of experts with recent operations experience to identify the critical assessment areas, so as to operationalize the MCAT for analysis.

• DUSA-IA develop and push for acceptance of standardized evaluation criteria for those critical MFC activities, at a minimum within the Army and if possible for all US services.
For Table of Contents and List of Figures see attached documents entitled “Table of Contents” and “List of Figures”
ACKNOWLEDGMENTS

Many people in the Department of the Army, the Joint Staff, and the Office of the Secretary of Defense deserve thanks for their time and cooperation. The following individuals were particularly helpful in the course of this research: COL Karl Farris (ret.) at the US Army Peacekeeping Institute, LTC Richard R. Caniglia at the American, British, Canadian, Australian Armies' Standardization Program (ABCA), and Mr. Jim Stone at the Joint Warfighting Center (JWFC).
Acronyms

ABCA      American, British, Canadian and Australian Armies' Standardization Program
ACRI      African Crisis Response Initiative
AIA       Army International Activities
AIAP      Army International Activities Plan
AMC       Army Materiel Command
ARRC      Allied Command Europe (ACE) Rapid Reaction Corps
C4I       Command, Control, Communications, Computers and Intelligence
CAA       Capability Assessment Area
CIA       Central Intelligence Agency
CinC      Commander in Chief
CM        Compatibility Measure
CONUS     Continental United States
DCSLOG    Deputy Chief of Staff, Logistics
DoD       Department of Defense
DUSA-IA   Deputy Under Secretary of the Army, International Affairs
FAO       Foreign Area Officer
HNS       Host Nation Support
HQDA      Headquarters, Department of the Army
IFOR      NATO Implementation Force
IMET      International Military Education and Training
JCS       Joint Chiefs of Staff
KFOR      NATO Kosovo Force
LMI       Logistics Management Institute
MACOM     Major Area Command
MCAT      Military Compatibility Assessment Tool
MDEP      Management Decisions Package
MFC       Multinational Force Compatibility
MOD       Ministry of Defense
MOU       Memorandum of Understanding
NATO      North Atlantic Treaty Organization
O&M       Operations and Maintenance
OSD       Office of the Secretary of Defense
PfP       Partnership for Peace
PM        Political-Military
RDT&E     Research, Development, Test and Evaluation
RSI       Rationalization, Standardization, and Interoperability
RSO&I     Reception, Staging, Onward movement and Integration
SA        Security Assistance
SAO       Security Assistance Officers
SATCOM    Satellite Communications
SEATO     South East Asia Treaty Organization
SFOR      NATO Stabilization Force
SIGINT    Signal Intelligence
SINCGARS  Single Channel Ground and Airborne Radio System
SOFA      Status of Forces Agreement
TDA       Table of Distribution and Allowances
TEPMIS    Theater Engagement Plan Management Information System
TOE       Table of Organization and Equipment
TRADOC    U.S. Army Training and Doctrine Command
TSPS      Theater Security Planning System
USMA      United States Military Academy
I. INTRODUCTION
Policy Problem Statement

Ultimately, we risk diminishing our collective effectiveness, as Allies unwilling to commit sufficient resources become less interoperable with those who make the necessary investment in modern war-fighting technologies. It is not just a matter of incompatible equipment, but, over time, incompatible doctrine….
--William S. Cohen, U.S. Secretary of Defense (8 Feb 98)

Over the past decade, the Army's involvement in operations requiring close participation with partner country militaries increased significantly. Multiple missions, including Operation Desert Storm in the Persian Gulf and, most recently, Operations Joint Endeavor and Joint Guardian in Bosnia and Kosovo, have continually tested the interoperability of the participating nations. The lessons drawn from these operations demonstrate the need to design methods for enhancing compatibility between nations across the scope of mission types and requirements, from combat operations to peacekeeping and peace enforcement.

During Desert Storm, compatibility issues were raised in C4I, doctrine, technology and procedures. Coalition military planners used a variety of "workarounds" to overcome the initial lack of compatibility between forces. Workarounds are mitigating measures that reduce an incompatibility's effects; they stand in contrast to "fixes," which reduce incompatibility at its roots. The US used liaison teams to train Saudi forces and to augment Saudi command and control assets. The US also mitigated incompatibilities in technology by loaning equipment to the coalition members. Five ground-mobile force/defense satellite communications systems were transferred to British units in order to address command and control (C2) limitations between Britain's command headquarters and British forces on the ground. The US further
addressed technological compatibility deficiencies by assuming the burden of responsibility for some coalition capabilities. For example, the US provided the majority of the coalition's command, control, communications, computers and intelligence (C4I), including most of the intelligence collection through its satellite systems. All of these methods for addressing compatibility deficiencies represent short-term compatibility fixes designed to gain the minimum compatibility required to successfully complete the mission. While they were effective for the short duration of this particular operation, these workarounds may not have been as successful had the coalition been tested by high-intensity conflict for a longer period of time. 1

The object of this study is to improve the Army's efforts at enhancing other militaries' abilities to function as effective coalition partners over the long term. Rather than designing workarounds, this study is designed to provide a method for identifying compatibility deficiencies well in advance of any possible operation so the partner militaries, with the aid of the US, can "fix" the deficiency. The likelihood of addressing all of the possible compatibility problems is minimal, and there will always be a need for creative workarounds prior to multinational operations. However, identifying compatibility shortfalls with possible partners well in advance of a multinational operation will achieve several significant goals. First, it will reduce the dependency on innovative thinking by coalition planners to devise workarounds and leave them more time to concentrate on mission planning and execution. Second, it will increase the likelihood of success for coalition operations under increasing levels of intensity and duration. Finally, it will address the concern of Defense Secretary William Cohen by placing the emphasis of US military relations with partner countries on enhancing compatibility and therefore maintaining our collective effectiveness.
1 See Michele Zanini and Jennifer Morrison Taw, Force XXI: Implications for Multi-Force Compatibility, RAND DRR-1961-A, September 1998.
As the Army participated in an increasing number of multinational operations, its leadership realized the need for an agency that could concentrate on the Army's interactions with partner countries. In 1995, the Secretary of the Army created the DUSA-IA office (Deputy Under Secretary of the Army for International Affairs) and gave it responsibility for planning, coordinating, and facilitating the Army's international activities. While DUSA-IA continues to spend significant effort on establishing a credible and successful Army international program, it has had difficulties in reaching its goals, including those with regard to multinational force compatibility (MFC). While DUSA-IA has responsibility for the Army MFC program, the institutional Army currently continues to function without an explicit long-term plan for its MFC efforts.

DUSA-IA is seeking methods to improve its capabilities for managing the Army's international efforts at enhancing MFC. As a new office, it faces two fundamental, systemic challenges: (1) a lack of institutional power; and (2) an inability to control any significant portion of the Army's international activities (AIA) budget. The lack of institutional power stems from the agency's lack of institutional experience. In existence only five years, the agency has not created the web of contacts or established its area of control within the defense planning system. The second problem, the lack of fiscal control, results from the decentralized control of funding for the wide variety of international activities and the difficulty of identifying which of those activities are dedicated to MFC enhancement.

Identifying Army resources devoted to AIA is difficult for many reasons. The broadest reason is that international activities are pervasive throughout Army operations. Although the integration of the Army's international activities into its everyday tasks benefits and furthers the Army's strategy of engagement, it makes it difficult to identify and isolate those aspects of Army operations that constitute IA. Further complicating
the problem is the fact that not all international activities directly benefit MFC. For example, the Army's efforts in implementing the INF treaty are obviously an international activity, but it would be difficult to identify the benefits to MFC that would result directly from this activity. Another issue that makes it difficult to isolate the specific MFC resources is that the Army's international activities incur indirect costs that are very difficult to capture. For example, the costs associated with Army personnel traveling to attend bilateral and multilateral meetings overseas can be identified as direct costs of IA and also of MFC. However, unless the individual is employed with a unit dedicated to IA, the time they spend on the trip (that portion of their salary) is an indirect cost associated with the meeting that cannot be easily captured. Thus, funding is spread across Congressional appropriations accounts and throughout the Army's internal budget management structure. This diffusion makes it difficult to identify and to manage resources allocated to IA. 2

This study does not attempt to solve these two issues, which, while critical, do not appear to have any immediate or feasible solutions. Rather, I attempt to provide DUSA-IA with an alternative method for improving the effectiveness of Army MFC activities within the current constraints. While DUSA-IA may not be able to demand control of all of the activities and their funding, it most certainly can provide clear and meaningful strategic policy guidance for Army MFC, and provide suggestions and recommendations to the JCS (Joint Chiefs of Staff), the individual CINCs (Commanders in Chief), and the Army Staff regarding the Army focus for MFC activities.

This project stems from my work as a member of a project team at the Arroyo Center, RAND, Santa Monica, CA. This Applied Policy Project includes and builds upon my research
2 Thomas S. Szayna, Frances M. Lussier, Krista Magras, Olga Oliker, Michele Zanini, and Robert Howe, Improving Army Planning for Future Multinational Coalition Operations, RAND DRR-2174-A, Sept 1999, pp. 10-11.
and written contributions to the RAND project entitled Improving Army Planning for Future Multinational Coalition Operations.
Scope and Focus of the Policy Project

This analysis presents a method for enhancing MFC that can be easily and immediately incorporated into the current DUSA-IA decision-making process. There are a wide variety of factors that affect the ability of partner countries to successfully operate together and that go well beyond simple military capabilities, including political systems, historical backgrounds, common experiences, and societal makeup. However, these factors are not only difficult to identify but almost impossible to control or "fix". This study focuses on what can be fixed by identifying deficiencies in military compatibility.

In the course of developing a quantitative method for assisting the Army with enhancing MFC efforts, this analysis provides an in-depth evaluation of current strategic guidance on multinational force compatibility and a basic assessment of current Army activities that enhance multinational force compatibility and how they are evaluated. As this project is oriented toward the international activities of the Army, it focuses on how to improve MFC as a function of improving the military capabilities of our friends and allies; it does not consider unilateral improvements to US forces designed to enhance MFC. Additionally, this project does not address issues such as host-nation support, non-governmental organizations or international organizations, all of which are important to coalition operations but beyond the inquiry of this project.

MFC involves a wide range of activities that can be enhanced in a variety of ways. Each individual program manager will have ideas about which MFC-enhancing efforts should be implemented for their specific area, and how. Focusing on the individual areas requires tailored and sustained solutions, and DUSA-IA does not need to micromanage the Army's international
activities by providing individual solutions to every MFC problem. Rather, DUSA-IA is responsible for providing MFC guidance, oversight and continuity to the individual program managers and unit leaders.

To aid DUSA-IA in this goal, this project focuses specifically on the development of a method, involving both qualitative and quantitative analysis, to assist DUSA-IA in identifying deficiencies in multinational force compatibility with possible partner countries across a range of missions and functional areas. The decision support tool resulting from this policy analysis is entitled the Military Compatibility Assessment Tool, or MCAT, and it is designed to assist DUSA-IA in making strategic-level multinational force compatibility policy decisions. The tool specifically allows for the identification of a country's compatibility deficiencies and strengths in critical mission areas. The MCAT produces a hierarchy of states based on their ground forces' compatibility with the US Army and identifies areas of greatest need and potential bottlenecks in coalition operations with the US. It does so in a relative fashion, allowing for comparisons among the states while judging all against the US as the standard point of reference. The MCAT allows DUSA-IA to improve MFC within the current system constraints by giving decision-makers a more reasoned basis on which to make good policy choices.
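As an illustration of the kind of scoring and ranking logic such a tool performs, the minimal sketch below computes weighted compatibility scores for a few hypothetical partner countries and orders them relative to the US baseline. The capability areas, weights, and ratings are invented for illustration only; the actual MCAT assessment areas and scoring scheme are developed later in this study.

```python
# Minimal sketch of MCAT-style scoring: each country is rated 0-100 in a few
# capability assessment areas (CAAs), with the US Army fixed at 100 as the
# reference point. All areas, weights, and ratings are hypothetical.

CAA_WEIGHTS = {"doctrine": 0.30, "operations_training": 0.30, "c4i": 0.25, "materiel": 0.15}

ratings = {  # hypothetical ratings relative to the US baseline of 100
    "Country A": {"doctrine": 85, "operations_training": 70, "c4i": 60, "materiel": 75},
    "Country B": {"doctrine": 60, "operations_training": 55, "c4i": 40, "materiel": 50},
    "Country C": {"doctrine": 90, "operations_training": 80, "c4i": 75, "materiel": 85},
}

def compatibility_score(country_ratings):
    """Weighted average across capability assessment areas (0-100 scale)."""
    return sum(CAA_WEIGHTS[area] * score for area, score in country_ratings.items())

def weakest_area(country_ratings):
    """The area most likely to become a bottleneck in a coalition operation."""
    return min(country_ratings, key=country_ratings.get)

ranking = sorted(ratings, key=lambda c: compatibility_score(ratings[c]), reverse=True)
for country in ranking:
    print(f"{country}: score={compatibility_score(ratings[country]):.1f}, "
          f"weakest area={weakest_area(ratings[country])}")
```

A mission- or region-tailored analysis would simply swap in a different weight set, which is one way the tailoring for contingency planning described above could be realized.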
Policy Project Contents

Chapter II provides background and analysis of the basic problem of "How can the Army improve its multinational force compatibility efforts?" It describes the objectives and method for approaching this problem, providing detailed information on both the current strategic multinational force compatibility guidance and Army activities and their evaluation.
Chapter III details the development of the Military Compatibility Assessment Tool, providing a description of the methodology, structure and capabilities of the decision support system. Chapter III also provides analysis on how to operationalize the MCAT for analysis.

Chapter IV presents the conclusions of the study, including specific policy recommendations for DUSA-IA, and details the policy implications of the study. Of critical value is the identification of the obstacles which DUSA-IA may face in implementing policies designed to enhance multinational force compatibility, and methods for overcoming those obstacles. This chapter also identifies steps for further enhancement of multinational force compatibility.
II. POLICY ANALYSIS

1. Introduction

The fall of the Berlin Wall in 1989 signaled a change in the world system, an end to the bipolarity of the Cold War and the commencement of U.S. hegemony. In response to this new international system, the President's national security strategy, A National Security Strategy for a New Century, stresses an integrated approach to creating a peaceful 21st Century global system where the United States reaches out to other nations to create an infrastructure of institutions and arrangements. This strategy directs a shift in U.S. military requirements. While direct guidance to the military forces includes first and foremost maintaining the ability to act alone, the President acknowledges the need for enhanced coalition capabilities, and directs the U.S. military forces to strengthen alliances and coalitions, "to meet the challenges of an evolving security environment." 3

The recent increase in U.S. participation in multinational operations stands as testimony to the need for enhancing multinational force compatibility (MFC). In the ten years since the collapse of the Soviet Union, the U.S. Army participated in 33 major deployments, compared with 10 in the previous 44 years. For the majority of these deployments, American ground forces worked with other nations' ground forces to accomplish mission objectives.
1.1 The Problem

The object of this study is to improve the Army's efforts at enhancing other militaries' abilities to function as effective coalition partners. The best method for advancing Army MFC
3 A National Security Strategy for a New Century, The White House, October 1998, p.13.
efforts would be to conduct an analysis of current Army MFC efforts along the lines of effectiveness and efficiency and then submit recommendations for improvement. However, two major shortfalls in the current Army international activities program hamper such an analysis. The first is a severe problem in identifying resources utilized for MFC efforts (inputs), and the second is the lack of a clear set of institutional goals for MFC (outputs) and a standardized method for their evaluation.

In recognition of the need for greater coordination of the Army's international activities, the Army created the office of the Deputy Under Secretary of the Army for International Affairs (DUSA-IA). DUSA-IA is charged with planning, coordinating and facilitating the Army's international activities. 4 However, insufficient adaptation of other parts of the Army to the creation of DUSA-IA has hampered the effective functioning of the office. The primary shortfalls lie within the Army budget, which does not portray the resources dedicated to international activities in any consistent manner. DUSA-IA controls few of the funds directly, and this lack of clear, detailed knowledge reduces the visibility of the Army's efforts in the intra-Army planning, programming and budgeting process. The lack of visibility also hampers efforts to conduct an effective evaluation of the Army's current MFC efforts. The inability to calculate the cost-effectiveness of MFC efforts makes it difficult to present the potential benefits in terms of resources, and it makes it nearly impossible to accurately assess trade-offs among MFC efforts and to choose among them wisely.

The second problem lies in the identification of outputs from MFC activities. The most crucial shortfall in the identification of outputs of MFC efforts is the lack of a set of identifiable Army-level institutional goals for MFC. Furthermore, while measures of levels of compatibility
of allied and partner militaries in crucial categories are available, they are currently not organized in an easily accessible form. Further complicating the issue is the lack of standardized Army-level metrics or criteria for the evaluation of MFC activities. Evaluation of effectiveness takes place at the level of individual programs or exercises, with metrics and evaluation criteria varying depending on the Army command or agency involved. There is currently no coordinated set of measures used to evaluate a partner country's compatibility in the critical mission areas. This is an indicator of the highly decentralized nature of the entire MFC planning and implementation process.

The issues outlined above demonstrate the difficulty of conducting an analysis of existing Army MFC activities based either on effectiveness or on efficiency. The current situation makes it difficult to evaluate the current shortfalls and to present a list of recommendations for improvement. However, the following analysis demonstrates how the Army can enhance its MFC efforts by developing an MFC blueprint. 5
1.2 Objectives, Approach and Organization

In an overall sense, this study seeks to help the U.S. Army improve its multinational force compatibility efforts. In doing so, some of the critical questions are: How can the Army prioritize among the variety of allies and partners in terms of coalition enhancement activities? What problems in other armed forces pose the greatest obstacles to more effective coalition operations? How can DUSA-IA improve its ability to guide strategic policy-making decisions with respect to multinational force compatibility? Which regions of the world contain
4 The responsibilities of DUSA-IA were codified in 1997; General Orders No. 10, Assignment of Functions, Responsibilities, and Duties within the Army Secretariat, Headquarters, Department of the Army, Washington, D.C., August 12, 1997.
5 Another member of the team, Frances Lussier, provided the budget analysis in the final RAND report, Improving Army Planning for Future Multinational Coalition Operations.
partner countries with enhanced compatibilities for a particular type of operation, and which regions pose serious problems for creating a coalition of compatible forces?

In order to answer these questions, I conducted a comprehensive study of the existing state of affairs regarding the Army's formulation and implementation of multinational force compatibility policy. I reviewed the current strategic guidance to the Army on multinational force compatibility. I then examined a sample of the existing critical Army MFC activities and how they are evaluated for effectiveness. The results of these investigations demonstrated two key factors that help explain the current lack of a clear Army MFC policy.

First, the review of the current guidance demonstrated that there is direct and clear strategic guidance from the President and the Chairman of the Joint Staff to the Army on the importance of MFC. The strategic guidance is broad based and leaves much of the execution at the discretion of the individual services. This discretion allows the Army to focus its efforts on enhancing MFC in the most effective and efficient manner for the Army. However, while the service has taken action by creating the office of DUSA-IA to develop MFC policy, there is no current strategic Army policy for MFC, and this significantly hampers the service's efforts at focusing, guiding and evaluating MFC enhancement activities.

Second, the review of current Army evaluation practices for MFC activities showed that Army MFC activities are highly decentralized, and while they may be effective on an individual basis, they are not necessarily coordinated to achieve any Army-level MFC policy. In order for DUSA-IA to have an effect, it must design and implement MFC policy that can be executed in a decentralized manner (in keeping with current Army doctrine) but that is focused on a centralized strategic plan for Army MFC.
The second portion of the study provides a method combining qualitative and quantitative analysis to assist DUSA-IA in designing an Army-level MFC policy. Adapting a RAND-developed decision support system (DynaRank), 6 I created the Military Compatibility Assessment Tool (MCAT), which makes it possible to identify the main compatibility problems and match the appropriate policies and programs to ameliorate those problems.

RESEARCH SOURCES

The initial assessment of current MFC policy and activities is based on a review of relevant documents and regulations. I also interviewed numerous Army and Department of Defense (DoD) personnel working with MFC policy and on MFC issues. In addition, I conducted a literature search for other analytical efforts on the topic of MFC. The MCAT stems from recent operations research work at RAND.
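To make the matching step described above more concrete, the following sketch maps a partner country's weakest assessment areas to candidate MFC-enhancing activities. The deficiency threshold, the activity catalogue, and the country ratings are hypothetical placeholders, not MCAT or DynaRank data; the sketch only illustrates the general "identify the problem, then match a program to it" logic.

```python
# Illustrative sketch of matching compatibility deficiencies to candidate
# MFC-enhancing activities. Threshold, catalogue, and ratings are hypothetical.

DEFICIENCY_THRESHOLD = 70  # areas rated below this (US baseline = 100) need attention

# Hypothetical catalogue: which kinds of activities address which capability area.
ACTIVITY_CATALOGUE = {
    "doctrine": ["doctrine exchanges", "standardization working groups"],
    "operations_training": ["combined field exercises", "staff talks"],
    "c4i": ["communications interoperability trials", "equipment loans"],
    "materiel": ["foreign military sales", "co-development programs"],
}

ratings = {  # hypothetical partner ratings by capability area
    "Country A": {"doctrine": 85, "operations_training": 70, "c4i": 60, "materiel": 75},
    "Country B": {"doctrine": 60, "operations_training": 55, "c4i": 40, "materiel": 50},
}

def recommend_activities(country_ratings):
    """Return candidate activities for each area rated below the threshold,
    ordered so the deepest deficiency (largest shortfall) comes first."""
    shortfalls = {area: score for area, score in country_ratings.items()
                  if score < DEFICIENCY_THRESHOLD}
    ordered = sorted(shortfalls, key=shortfalls.get)  # worst area first
    return {area: ACTIVITY_CATALOGUE[area] for area in ordered}

for country, country_ratings in ratings.items():
    print(country, recommend_activities(country_ratings))
```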
2. Strategic Guidance on Multinational Force Compatibility

This section provides an overview of the current strategic MFC guidance. The Army receives guidance on Multinational Force Compatibility (MFC) from a variety of sources. The strategic guidance in the National Security Strategy and the National Military Strategy is broad and sweeping. The Joint Training Policy and the Joint Training Master Plan provide more specific, pointed guidance on conducting coalition operations. Further detailed and region-specific guidance is provided to the CINCs and Service Chiefs through the Joint Strategic Capabilities Plan (JSCP) and the Defense Planning Guidance (DPG).

The current strategic documents reflect the major changes of the world security system over the past decade and address the new challenges expected in the 21st century. Barring any
6 See Richard J. Hillestad and Paul K. Davis, Resource Allocation for the New Defense Strategy: The DynaRank Decision-Support System, RAND, 1998.
major world system changes, new administrative guidance is likely to change only slightly with each new administration, reflecting the character of the new leadership but not changing the focus of the military's efforts at enhancing MFC.

The US military maintains a centralized planning structure and a decentralized execution structure. While the leaders provide guidance, the day-to-day operations are very much decentralized. The strategic MFC guidance the upper echelons of the US military provide to the individual services is designed to function in this decentralized manner. While providing a guiding structure, it is broad based and leaves the details of the execution to the discretion of the individual services. The decentralization of execution in MFC activities allows the Army to focus its efforts on enhancing MFC in the most effective and efficient manner for the Army.
2.1 Strategic Guidance Documents

National Security Strategy

The highest level of guidance comes from A National Security Strategy for a New Century, 7 which outlines the President's priorities for national defense. The President emphasizes the role of the United States as one of "engagement". As "the most powerful force for peace, prosperity and the universal values of democracy and freedom", the United States' challenge is to "sustain the role, by harnessing the forces of global integration." 8 The President stresses an integrated approach to creating a peaceful 21st Century global system, where the United States reaches out to other nations to create an infrastructure of institutions and arrangements.
7 A National Security Strategy for a New Century, The White House, October 1998.
8 Ibid, p.iii.
Direct guidance to the military forces includes first and foremost maintaining the ability to act alone. However, additional guidance stresses the critical need for creating lasting relationships with US friends and allies. US military forces build coalitions and shape the international environment in ways that protect and promote US interests. The National Security Strategy directs the US military forces to strengthen alliances and coalitions, "to meet the challenges of an evolving security environment." 9

The strategic guidance, while not specifically referencing MFC, stresses the importance of multinational compatibility through its references to the expansion of military alliances such as NATO and to increased military-to-military activities, including defense cooperation, security assistance, and international training and exercises. The National Security Strategy references both the Partnership for Peace and the President's Warsaw Initiative as successful examples of how the US is using military forces to strengthen relationships with allies and friends. These programs increase the interoperability between NATO and the militaries of Central and Eastern Europe. An additional program referenced in the guidance is the African Crisis Response Initiative (ACRI), which entails training African militaries to conduct peacekeeping and humanitarian missions.
National Military Strategy

A second document providing strategic guidance is the National Military Strategy (NMS). The NMS is formulated from the National Security Strategy and the Quadrennial Defense Review (QDR). This document "provides advice from the Chairman of the Joint Chiefs of Staff (CJCS) in consultation with the Joint Chiefs of Staff (JCS) and the combatant
9 Ibid, p.13.
commanders on the strategic direction of the Armed Forces over the next three to five years." 10 Not surprisingly, the guidance also stresses the imperatives of "engagement", emphasizing the Armed Forces' role of interacting with militaries from other nations as a critical component of the National Security Strategy.

Under the strategy of "Shape, Respond, Prepare Now: A Military Strategy for a New Era", 11 the CJCS stresses the importance of increasing compatibility with our allies and friends by including multinational force operations as one of his top three considerations for guiding the employment of military forces. "While retaining unilateral capability, whenever possible we must seek to operate alongside alliance or coalition forces, integrating their capabilities and capitalizing on their strengths." 12 Thus implied is the need for US forces to train and prepare for multinational operations.

The NMS provides guidance to the Armed Forces on increasing MFC through engagement activities. The Armed Forces are used to "Shape" the international environment, which means utilizing the military to engage other countries in creating effective relationships so as to enhance regional or systemic security. One of the key components is the international exercise program. International exercises not only enhance interoperability and readiness but also demonstrate the United States' ability to form and lead effective coalitions. The NMS details additional benefits of participating in multinational exercises as opportunities for the US forces to demonstrate their capabilities, train under realistic conditions, gain geographic familiarity and develop an understanding of cultures, values and habits of other societies.
10 National Military Strategy, Chairman of the Joint Chiefs of Staff, FY97, p.1. See also http://www.dtic.mil/jcs/core/nms.html
11 Ibid, p.1.
12 Ibid, p.10.
Further activities that the NMS outlines for military engagement include information sharing, foreign military sales and the International Military Education and Training (IMET) program.

The NMS, while recognizing the need for unilateral action, further stresses the requirements for enhancing MFC across the entire range of military actions. The Armed Forces must be prepared to "Respond" to the full range of possible military operations, from humanitarian and peacekeeping missions to major theater wars. The NMS requires that US forces prepare to respond unilaterally to any or all of these mission types. However, the CJCS also recognizes the need to maintain interoperability with allies and potential coalition partners. "Although we must retain the capability to act unilaterally, we prefer to act in concert with our friends and allies. Laying a solid foundation for interoperability with our alliance and potential coalition partners is fundamental to effective combined operations. We remain committed to doctrinal and technological development with our key allies and to combined training events and exercises that contribute to interoperability." 13

The National Security Strategy and the National Military Strategy both provide broad guidance on the subject of MFC. While they do not outline specific compatibility requirements for the military forces, they require US forces to participate in military-to-military interactions that will "build constructive security relationships . . . and enhance stability." 14 Both documents cite the key role of the military in shaping the international environment through overseas presence and engagement activities, with specific reference to multinational training events such as the Partnership for Peace exercises. Both of these documents present clear guidance, dictating that US forces must commit their efforts to training for missions in concert with other allies and potential coalition partners across the spectrum of military operations.
13 Ibid, p.16.
14 Ibid, p.10.
2.2 The Joint Staff

The Joint Staff, operating under the guidance of the CJCS, assists the CINCs in implementing the national military strategy. The staff is responsible for all joint matters, including strategic planning and contingency planning of the integrated employment of land, sea and air forces. The Joint Staff produces the initial guidance to the CINCs and Service Chiefs on MFC requirements and assists in the implementation and evaluation of compatibility measures. The J7 (Operational Plans and Interoperability) is the staff section responsible for interoperability issues and will be discussed in detail at the end of this section. Even though the J7 has the lead on MFC, many of the Joint Staff directorates provide guidance, implementation and evaluation for MFC within their areas of responsibility. 15
J3 - Operations

The J3 is in charge of current plans and operations. It is responsible for ongoing regional operations, including monitoring and coordinating support for current UN operations and developing and publishing the operational Rules of Engagement (ROE). With regard to MFC specifically, the J3 ensures that current plans and operations are consistent with national standing rules and other applicable guidance.
15 The following analysis does not provide a full account of each staff section's responsibilities, but rather details a few specific areas where they influence MFC.
J4 - Logistics

The J4 has a specific division dedicated to multinational logistics support. Within the division, the staff is developing improved logistics procedures, programs and planning for multinational operations. The goals are focused on four specific areas, outlining methods for US forces to utilize in improving compatibility and interoperability:
• Establish a framework for US involvement in multinational operations
- Joint Publication 4-08, Logistics Support for Multinational Operations, is under development. The J4 is directing additional efforts toward providing guidance to unified commanders and service chiefs on the critical logistic functions required for successful multinational operations, including the necessity to exercise the logistics mechanisms in multinational training.

• Expand bilateral agreements
- Increase the content and number of Acquisition and Cross-Service Agreements (ACSAs) and Host Nation Support (HNS) Agreements, both of which promote interoperability by increasing the host nation's "buy-in" to the international operation and strengthening defense cooperation between the forces.

• Leverage multinational capabilities
- Improve the logistics capabilities of multinational organizations such as the UN and NATO through the following programs: assigning U.S. liaison logistics officers to the UN Department of Peace-Keeping Operations, and supporting NATO initiatives such as the Multinational Joint Logistics Center and the Pacific Area Senior Officers Logistics Seminar (PASOLS) education initiative.

• Share technology to promote interoperability
- The directorate is in the process of "capturing existing business practices" on logistics in the international arena. It will then use this list to evaluate returns to asset investments and develop a priority list for the best logistics mechanisms to increase interoperability. 16
J5 - Strategy and Policy

The J5 has responsibility for strategic plans and policy. It publishes many of the strategic documents guiding the CINCs and Service Chiefs in their implementation of MFC. The J5 monitors and reviews the strategic environment to ensure that future security planning reflects real-world trends and issues. It is responsible for publishing the Joint Strategy Review, which recommends changes and additions to the current NMS and also recommends future alternative strategies. The directorate is also responsible for publishing the CJCS's long-range vision, Joint Vision 2010.

The J5 not only provides long-range guidance but also provides more specific short-term guidance through the Joint Strategic Capabilities Plan (JSCP). The JSCP provides specific guidance from the CJCS to the CINCs, apportioning forces and detailing the types of contingencies to plan for and the priorities for training the forces within each area of responsibility.
16 The J4 mentioned the use of the DOS/OSD Enhanced International Peacekeeping Capability (EIPC) prioritization to identify the best candidates for integration efforts. Following identification, a determination will be made whether to promote US systems internationally or to adapt to allies' systems on a case-by-case basis.
J7 - Operational Plans and Interoperability

The J7 mission is to assist the CINCs in: 1) their preparation for joint and multinational operations, 2) the conceptualization, development, and assessment of current and future joint doctrine, and 3) the accomplishment of joint and multinational training and exercises. While the primary focus is on joint interoperability, there is mention of combined interoperability in the missions and publications of the J7 directorates, which include the Joint War Plans Division, the Joint Doctrine Division, the Joint Training and Exercise Division, the Joint War Fighting Center and the Evaluation and Analysis Division, which includes the Joint Center for Lessons Learned.
Joint Doctrine Division

The Joint Doctrine Division is responsible for developing doctrine on multinational operations. Doctrine development occurs in a five-step process including both the JCS and the services as integral players. The first step is the proposal of a project from one of the services, CINCs or the Joint Staff. The J7 validates the requirement and initiates the program directive. Step 2 includes staffing the project with the Services and CINCs and releasing the project to a Program Directive Lead. The draft publication is developed by the lead in Step 3 of the process. Step 4 is JCS staffing of the draft and formal CJCS approval of the publication. The final step is the assessment and revision of the publication. The combatant commanders and services utilize the publication for 18 to 24 months and then provide the J7 with a written evaluation of the publication. Revisions to the publication are to be made within five years of initial publication.
Joint Training and Exercise Division

The Joint Training and Exercise Division is responsible for publishing training guidance and managing the coordination of major training exercises. The following section covers some of the key documents providing guidance on MFC issues.
Joint Training Policy (CJCSI 3500.01A)

The Joint Training Policy for the Armed Forces of the United States establishes CJCS policy for planning and conducting joint and multinational training. While providing more specific guidance than the NMS, most of the applications are from a short-term rather than a long-term perspective. The criticality of MFC is established in the opening letter from the Chairman of the Joint Chiefs of Staff, where he includes multinational operations in his list of the top three priorities for military training:
1. Prepare for WAR
2. Prepare for MOOTW (prioritized by each combatant commander)
3. Prepare for Multinational/Interagency Operations
Section III of the document provides general guidance for MFC training: "Commanders and staffs of the forces or components that are likely to be employed in multinational operations should be exercised regularly during peacetime with their multinational partners." 17 The section also lists the key tenets for Multinational Joint Training.
Key Tenets of Multinational Joint Training
• Focus on critical requirements and vulnerabilities
• Increase interoperability through exploration of various partner capabilities
• Emphasize focused logistics and command and control

The joint training policy provides further guidance on Multinational Joint Training by acknowledging that reduced force structure and limited training resources may prevent large-scale training exercises and recommends "smaller, more focused, talk-based training events" as a solution. The document also recommends developing a common frame of reference for operating in a multinational environment. This common standard can be developed in the form of standardization agreements (STANAGs), where mutually acceptable tasks, conditions and standards are agreed upon. Another method is for commanders to utilize the Universal Joint Task List (UJTL) published through the J7. The UJTL is further referenced in the Joint Training Master Plan.

Joint Training Master Plan 2000 (CJCSI 3500.02B)

The Joint Training Master Plan provides guidance to the combatant commands and Services for planning and conducting joint training exercises. This document further stresses the use of the UJTL during international exercises. "In multinational training, there are clear advantages in communicating missions, training programs and exercise objectives to all participants through the common language of a requirements-based training system." 18 Therefore, the JCS recommends sharing and educating other forces on the UJTL to facilitate its use during multinational operations.
17 Joint Training Policy, CJCSI 3500.01A, 1 July 1997.
The Joint Training Master Plan also recommends consolidating exercises due to "OPTEMPO and PERSTEMPO commitments, budgetary constraints and manpower reductions." 19 The plan recommends asset-conserving moves such as combining multiple CINC-sponsored exercises and giving priority to exercises that emphasize the use of technology (for example, a computer-generated command post exercise) over exercises that require large-scale troop movements.
Joint Doctrine for Multinational Operations (Joint Pub 3-16)

Additionally, the Joint Staff is producing a publication on joint doctrine for multinational operations. However, this publication is currently under construction and contains only a few pages, which are limited in scope and incomplete in discussion.
The Evaluation and Analysis Division

The Evaluation and Analysis Division of the J7 runs the Joint Center for Lessons Learned. The Center captures the comments of the participants of joint and combined exercises in after-action reports (AARs) and consolidates them in a database. All levels of command are invited to provide input using the Joint Universal Lessons Learned System. Additionally, the CJCS's Remedial Action Project Program provides a follow-up mechanism, ensuring that the lessons learned are utilized in future exercises. The division is attempting to improve this system through the Better Lessons Learned campaign, which is aimed at helping commanders to capture, act on and share lessons learned with one another. The Oversight Program for Joint Task Forces is a separate initiative aimed at improving joint and multinational exercises. This evolving
18 CJCS Joint Training Master Plan 2000 for the Armed Forces of the United States, May 1998.
19 Ibid.
project is designed to review JTF organization, manning and training to ensure consistency with existing doctrine. 20

In review, the J7 affects MFC through doctrine development and through training policy and implementation. The directorate has mechanisms for reviewing not only whether the doctrine is being adhered to but also whether the doctrine is consistent with current operational requirements or requires adjustments to meet the needs of the combatant commanders.
2.3 Detailed Guidance

Further detailed and region-specific guidance is provided to the CINCs and Service Chiefs through the Joint Strategic Capabilities Plan (JSCP) and the Defense Planning Guidance (DPG). Both documents are critical to the implementation of MFC activities. These classified documents provide specific details to the CINCs, including identifying and providing sourcing of US combat forces, allocating major combat forces and strategic lift among unified commanders, and providing strategic planning assumptions and directions for the development of operation plans (OPLANs) by unified commanders. These documents drive the training of the forces. If MFC activities are not part of the JSCP planning directive guidance, then CINCs are not likely to direct their forces to spend time training on them.
2.4 Conclusions

The National Security Strategy and the National Military Strategy clearly state that US military forces must be prepared to act in conjunction with allies and other possible coalition forces. Through broad, sweeping strategic guidance, these documents place emphasis on creating
20 Comments from this section are based on readings from the J7 program descriptions, and not on working knowledge of actual occurring processes.
working relationships with other nations' military forces and increasing interoperability through international training exercises and combined doctrinal and technological development.

The Joint Staff provides more in-depth guidance through joint publications and coordination with the CINCs and Services. The different staff sections provide guidance within their areas of responsibility. Key MFC guidance is detailed in the responsibilities of the J4, which provides guidance for improving multinational logistics; the J5, which publishes the JSCP; and the J7, which provides training policy, develops the UJTL and coordinates training exercises through the Joint Training Master Plan. The J7 also provides mechanisms for reviewing doctrine and compiling lessons learned from international exercises.

The most detailed strategic MFC guidance is found in classified documents, specifically the JSCP and the DPG. These documents are critical to the implementation of MFC activities. If the JSCP planning directive guidance does not direct a CINC to focus on MFC activities, they are unlikely to receive any attention.

The majority of the strategic guidance is extremely clear in its directive to improve operational capabilities with partner countries. However, both the Presidential and Joint Chiefs guidance leaves the CINCs and Service Chiefs ample room to maneuver in achieving the goal of enhanced MFC. The JCS publications concentrate on strategic guidance, leaving the operational and tactical MFC development, coordination and implementation to the individual services and CINCs. This decentralized policy is positive in that it allows the individual combatant commanders and services a great deal of flexibility in developing MFC policy within their areas of responsibility. However, the current lack of more specific guidance or of a guiding plan at the service level may lead to a variety of uncoordinated MFC actions within each theater and within the Army.
3. Army Evaluation of Multinational Force Compatibility

A critical component of a strategy to enhance Army MFC is a method for evaluation. Once the Army has identified MFC objectives and activities, it should develop a method for comprehensive assessment and evaluation of MFC activities. Standardized assessment can be used to measure an MFC activity's effectiveness and, in conjunction with a cost-benefit analysis, determine whether the activity should be continued. Assessment also provides a method for prioritization of MFC activities.

Given the varied and decentralized nature of Army international activities, MFC evaluation and prioritization processes occur at different levels. An analysis of MFC efforts can focus on an individual program, on a portfolio of programs managed by a particular Army agency, or at a broader Army (and interagency) level. The level of analysis clearly influences the types of issues that are explored in the evaluation process. For instance, program evaluation is mostly concerned with progress on specific tasks agreed upon by program participants, while an agency-wide assessment of MFC activities may explore whether a series of programs in the agency's area of expertise is addressing the appropriate issues with the appropriate participants. It is only at the Army-wide level that one can understand whether the combination of Army MFC programs supports a coherent and effective MFC strategy.

The following investigation of how the Army evaluates MFC activities provides evidence of the Army's decentralization of international activities. Program managers and unit commanders are setting MFC goals and conducting assessments of MFC activities, yet their efforts do not necessarily reflect broader Army MFC objectives. While these activities may be effective in enhancing MFC, there is no Army-level evaluation to determine if these various MFC activities are the most effective and efficient methods for achieving Army MFC goals.
The following section provides an overview of Army MFC objectives and activities and how they are evaluated.
3.1 Army Regulations on Evaluation of MFC Activities

Draft AR 34-1 outlines DA policy on MFC: "Specific objectives will be established for all MFC activities in which the Army participates." However, there has been no clear delineation of what constitutes an Army MFC activity or an MFC objective. Due to the decentralized nature of international activities, a standardized list encompassing all of the Army's MFC activities or objectives does not exist. However, a search of current publications found multinational objectives incorporated in joint publications, including the CJCS Joint Training Master Plan and the Universal Joint Task List, and in Army publications, including The Army Plan. Further objectives are outlined through multinational agreements including STANAGs, QSTAGs, and PfP training requirements.

Draft AR 34-1 says, "The Army shall assess, on an annual basis, the adequacy, effectiveness, efficiency, and productivity of MFC activities with respect to Army MFC objectives." 21 The regulation further assigns DUSA(IA) responsibility to "establish procedures and objective criteria for measuring the effectiveness of Army MFC activities and prepare an annual assessment of MFC activities." 22 The agencies in charge of implementing MFC programs also need to "evaluate the effectiveness of MFC activities for which the HQDA staff element has assumed responsibility, in accordance with guidance provided by DUSA(IA)." 23
21 AR 34-1 (Draft), Dec 98, p.21.
22 Ibid., p.23.
23 Ibid., p.28.
Draft AR 34-1 clearly states the requirements for centralized standards of evaluation for MFC. However, in practice, MFC evaluation is not being conducted according to any centralized standards. MFC activities are evaluated on an ad hoc basis, driven by the Army agencies or the chain of command in charge of the operation. Agencies often devise specific evaluation procedures for their MFC activity, and commanders determine performance measures based on their METL tasks and training objectives. Individuals at the lowest level are setting MFC objectives and priorities and evaluating MFC activities. However, these evaluations do not appear to rely on standardized criteria or metrics provided by DUSA(IA) and are not reflective of a broader Army strategy for MFC. The current Army International Activities Plan lists over 150 different activities, all of which are executed and evaluated by different organizations using locally determined standards.

Adding to the problem of evaluating MFC activities is the fact that very few, if any, of the lessons learned from these activities are forwarded to a centralized agency, i.e., the Center for Army Lessons Learned (CALL), for consolidation, analysis and dissemination. "Unit commanders may believe that the lessons learned in After-Action Reports and Quarterly Training Briefs are somehow flowing through the chain of command to arrive at CALL for subsequent dissemination; however, this is not the case. The number of After-Action Reports and lessons learned reports submitted to CALL voluntarily over the course of 1998 could be counted on one hand." 24
3.2 Evaluation of MFC Activities

In order to further investigate the evaluation of MFC activities and efforts, the following sections discuss MFC in the areas of doctrine, operations and training, and C4I, materiel and technology. These areas were chosen specifically for their critical value to enhancing compatibility and interoperability between ground forces. Analysis of past multinational operations demonstrates that these are the areas where incompatibilities are most likely to lead to operational problems and hamper mission success.

Doctrine is the basis for how a military operates. It provides the guidelines for all military activities. Should two nations operate on different doctrines (e.g., US versus Soviet doctrine), they will not be able to anticipate or understand one another's actions. Operations and training are the background for military actions. They provide the soldiers of each military with experience and determine how they execute a mission. Discrepancies in operations and training methods between militaries can lead to misunderstandings and an inability to work together towards a common goal. C4I is command, control, communications, computers and intelligence. This area is probably the most critical for compatibility. Mission success is likely to be hampered if the command structure or intelligence structure is not clear or widely accepted, or if the leadership cannot communicate with one another and effectively control the actions of the military units.

The Gulf War provided several examples of how incompatibilities in doctrine, operations and training, C4I, and materiel and technology caused operational problems and hampered mission success. While many of the problems could be addressed with short-term workarounds, including pre-deployment training and equipment loans or exchanges, not all of the problems could be mitigated. Military planners were forced to use measures such as geographical
24 See Major Arthur Tulak, Center for Army Lessons Learned, Collecting and Sharing Lessons Learned in the Total Army, p. 4. http://call.army.mil/call
Below are some of the compatibility problems highlighted during Operation Desert Storm. Non-NATO coalition partners were less compatible than French and British units; for instance, the C4I systems deployed by Arab partners were not sophisticated and had to be supplemented by equipment directly provided by the U.S. Arab coalition members were not prepared, from an organizational and training standpoint, to fight a maneuver war with the high combat tempos characteristic of AirLand Battle doctrine. Equipment diversity in Arab arsenals (with systems of varying ages originating from different countries) was a source of logistics problems, since it placed great pressure on spares and maintenance.25
NATO members were also affected by incompatibilities. The French forces, part of NATO but not of its integrated military structure, had not exercised with US units intensively enough to be able to use American battle management systems. The lack of night vision equipment in most French vehicles impeded their full employment at night or under unfavorable weather conditions. The French divisions also lacked the trained intelligence personnel to adequately carry out the Intelligence Preparation of the Battlefield (IPB) process.26 Even the British military, one of the US's closest allies, suffered from incompatibility problems. Shortfalls in self-sufficiency in logistics and service support, electronic warfare, and command and control systems were addressed by the provision of US systems and assistance.27 These examples demonstrate the importance of identifying compatibility deficiencies in the areas of doctrine, operations and training, and C4I, materiel and technology.
25 See Michele Zanini, p. 11.
26 James J. Cooke, 100 Miles from Baghdad: With the French in Desert Storm, Westport, CT: Praeger, 1993, pp. 57-58.
27 See Anthony H. Cordesman and Abraham R. Wagner, The Lessons of Modern War, Volume IV: The Gulf War, Boulder, CO: Westview Press, 1996, pp. 143, 158-162.
The following sections investigate the Army's current methods for evaluating MFC and demonstrate that the lack of clear centralized planning guidance from the Army has led to ad hoc evaluation methods that are not conducive to enhancing MFC at the institutional level. While the individual evaluations may help the unit or program, they do not appear to be based on standardized metrics and are not necessarily reflective of a broader Army strategy for MFC.
Doctrine
When a military organization lacks a common doctrine, it is difficult to achieve unity of effort.28 The lack of a common doctrine between military forces conducting combined operations can lead to complications across the entire range of military activities. The creation of a common doctrine can ease the strain between two or more nations, allowing them to use the same frame of reference for operations. There is no centralized Army evaluation of MFC issues at the doctrinal level. Army doctrine is developed in a decentralized manner, with each proponent agency determining the requirement for including interoperability issues in its publications. Currently, MFC and interoperability are being given greater consideration in some doctrinal areas. However, this change appears to be personality-driven rather than standards-driven. This observation is based on the lack of centralized guidance or methods for ensuring that MFC issues are incorporated into doctrine. There are several Army activities directed at increasing MFC through doctrinal efforts. The Army participates in the development of joint publications such as the Allied Joint Publications of the NATO alliance. The Army also provides representatives to the special working groups (SWGs) on doctrinal issues for both NATO and ABCA. These representatives
discuss interoperability issues and propose issues for standardization. The NATO alliance agreements are published as STANAGs (Standardization Agreements) and are required by AR 34-1 to be included in US Army doctrine. ABCA has a similar program, which produces standardized procedures in the form of QSTAGs (Quadripartite Standardization Agreements). While US Army doctrine should reflect these agreements made with the NATO allies and ABCA partners, there is no Army-level evaluation to determine the extent to which this requirement is achieved or the extent to which it enhances overall Army MFC. A second effort to increase MFC is the creation of NATO and ABCA handbooks and planning guides. These efforts are directed more at the "quick fix" level than at a long-term solution to doctrinal incompatibilities. These publications provide multinational force commanders and staffs with quick reference guides outlining practical information and critical questions to aid in the formation of a multinational task force. Both the handbooks and planning guides provide a "workaround" for commanders dealing with multinational task forces. In conclusion, compatible doctrine provides a standard frame of reference for nations when they are conducting combined operations. The Army participates in several activities directed at increasing MFC through doctrine. However, these programs' objectives and evaluations are regional, and centered on alliances and partnerships, rather than reflective of any Army-level strategy. While doctrinal efforts aimed at enhancing MFC are evaluated at the program level, there is no strategic-level evaluation to determine the combined effects of such programs or to set Army MFC priorities.
Operations and Training
28 See Thomas J. Marshall, ed., Problems and Solutions in Future Coalition Operations, US Army War College, Strategic Studies Institute, December 1997, p. 70.
MFC issues in operations and training are evaluated at both the Joint and Army level through exercise and operation evaluations and assessments.
Joint
The Joint Staff conducts evaluations through both the J7 and the Joint Warfighting Center (JWFC). The J7 staffs and trains evaluation teams to conduct regional exercise evaluations based on joint doctrine, the CJCS's objectives, and the CINC's METL tasks designated as training objectives for the exercise. While the Joint Staff has a framework for evaluating multinational operations, it is currently not conducting multinational assessments. Multinational operations evaluations should begin following the re-designation of ACOM as the Joint Forces Command. The JWFC conducts assessments of multinational operations. The JWFC assessment teams consist of both military and contractor observer/analysts. The details of the assessment are provided strictly to the commander of the exercise. However, should the team note a significant trend, it asks for permission to address the issue at a higher level. The standards for the assessment are developed during a workshop prior to the exercise. The commander, staff and assessment team agree upon measures to utilize in evaluating the unit's achievement of the training objectives. The standards come from the UJTL and the JMETL. While neither the J7 nor the JWFC evaluation/assessment teams solely address MFC, MFC issues are assessed in the larger context of evaluating the CINC's METL tasks. Additionally, the JWFC is developing plans for integrating MFC/multinational interoperability assessments into its future activities. It will likely model future multinational exercise assessments on the recently completed "Capabilities Based Interoperability Assessment Report"
for Joint Task Force Exercise 99-1. This JWFC report concentrated on assessing US Armed Forces joint interoperability. The assessment team drafted a set of required operational capabilities (ROCs) to be assessed. The ROCs reflect interoperability capabilities required by the joint force to accomplish joint operational (OP) or tactical (TA) tasks identified in the Universal Joint Task List (UJTL). These ROCs were then expressed in terms of the JV2010 operational concepts. The ROCs were then analyzed with the intent of identifying issues and challenges associated with interoperability among the JTF and its components. The analysis included identifying operational capability issues associated with doctrine, organization, training, materiel/equipment, leadership, and people (DOTML-P). The final assessment provided significant findings for improving joint interoperability. Development of similar ROCs for MFC would provide a method for evaluating multinational exercises and operations to achieve greater levels of MFC.
Army
The Army also provides evaluation and assessment of training exercises and operations. These evaluations are conducted at the CTCs (CMTC, JRTC, and NTC) and by assessment teams from the Center for Army Lessons Learned (CALL). As with the joint evaluations, MFC issues are only evaluated insofar as they are incorporated in the METL tasks and training objectives of the commander or driven by program requirements, e.g., Partnership for Peace (PfP). Rather than being driven by an Army strategy for MFC, these assessments are conducted on an ad hoc basis, driven by the chain of command's objectives and the requirements of the evaluation team involved in the performance assessment. The performance measures and the procedures for evaluating MFC activities are based on the commander's mission essential task list (METL) and
are generally regional or program directed rather than reflective of an Army-level standard for MFC.
Lessons Learned
A second important part of the process of evaluation is consolidating and analyzing the lessons learned from both exercises and operations. As with evaluation, collection of MFC lessons learned occurs on an ad hoc basis. There are no specific guidelines for what constitutes an MFC activity or how it should be evaluated. Thus, MFC lessons learned are captured only in the context of the specific operation and their relationship to the commander's objectives for the training exercise or operation. The Army has participated in a multitude of multinational operations over the past decade. Lessons learned were gathered through both active (assessment teams) and passive (forwarding of after-action reviews through the chain of command) methods of collection. While there appears to be a significant amount of data and lessons learned on multinational exercises and operations, there is very little, if any, broad-based analysis of MFC activities. Both the Joint Universal Lessons Learned (JULLS) program and the Center for Army Lessons Learned (CALL) databases contain lessons learned from multinational operations. However, there has been very little cross-cutting analysis to identify trends or problems related to MFC issues. Additionally, the databases, while outstanding for consolidating data, are still not "user friendly" in terms of search capabilities for identifying cross-cutting trends or problems related to a specific topic, e.g., MFC. To facilitate the use of lessons learned in the enhancement of MFC, the Army could designate a database for the primary purpose of consolidating multinational lessons learned. Additionally, assigning analysts trained and dedicated to evaluating multinational lessons learned would significantly improve the value of maintaining such a database.
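A consolidated multinational lessons-learned database need not be elaborate; tagged records that an analyst can filter by topic would already support cross-cutting analysis. The sketch below is purely illustrative and does not describe the actual CALL or JULLS data structures; the record fields, tags, and example entries are hypothetical.

    # Hypothetical sketch of a tagged multinational lessons-learned store.
    # Fields, tags, and example records are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class Lesson:
        operation: str
        year: int
        summary: str
        tags: set = field(default_factory=set)  # e.g., {"MFC", "C4I", "logistics"}

    def find_by_tag(lessons, tag):
        """Cross-cutting retrieval: all lessons carrying a given topic tag."""
        return [rec for rec in lessons if tag in rec.tags]

    lessons = [
        Lesson("Joint Endeavor", 1996, "Secure communications gaps with a partner brigade", {"MFC", "C4I"}),
        Lesson("Desert Storm", 1991, "Spares pressure from diverse partner equipment fleets", {"MFC", "logistics"}),
    ]
    print([rec.operation for rec in find_by_tag(lessons, "MFC")])

Dedicated analysts could then run queries of this kind across operations rather than reading each after-action report in isolation.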
C4I, Materiel and Technology
MFC program evaluation processes in the areas of C4I, materiel and technology cooperation have also been subject to little centralized guidance and assessment. To be sure, programs such as ABCA evaluate their own activities on a regular basis.29 However, the large number of MFC initiatives being managed by several agencies does not seem to rely on Army-wide criteria and metrics for performance evaluation. International cooperation in the field of C4I is a case in point. The Army participates in several international C4I fora; a recent Army Digitization Office briefing lists more than ten different initiatives.30 Army bodies such as TRADOC, the Army Digitization Office (ADO, part of the Office of the Director of Information Systems for Command, Control, Communications, and Computers, or DISC4) and the Test and Evaluation Command (TECOM, part of the Army Materiel Command, or AMC) are responsible for the implementation and evaluation of these programs. While coordination among these agencies and DUSA-IA occurs, contact is often on an ad hoc basis, leaving little room for regular assessments of the overall process. Recent moves to reduce the fragmentation of responsibility for international C4I programs should help the evaluation of efforts in this functional area. For instance, the recently created C2 Systems Interoperability Program (C2SIP) integrates a number of key initiatives, such as the Multinational Interoperability Program, and designates DISC4 as the lead agency within Army Headquarters (the European Command is also a sponsor).
29 ABCA provides a top-down system of monitoring and evaluation. Each standardization task is given a milestone date for completion. Every six months the Washington Standardization Officers (WSO) group checks progress towards the achievement of these milestones, using a Performance Indicator Reporting System (PIRS).
30 These include the C4I agencies of NATO's Military Committee, the C4I-related groups under NATO's Conference of National Armaments Directors, the Army Tactical Command and Control Information Systems (ATCCIS) under SHAPE, the Quadrilateral Armies Communications Information Systems Interoperability Group (QACISIG), Artillery Systems Cooperation Activities (ASCA), the Low-Level Air Picture Interface (LLAPI), the ABCA working group on C4I, the C4I Defense Data Exchange Program (DDEP), the Multinational Interoperability Program (MIP), and regular C4I staff talks. See Tactical Coalition Interoperability, ADO Briefing, March 16, 1999. Available on the Internet at: http://www.ado.army.mil/Br&doc/TIC/TCI/index.htm
The program does contain a number of evaluation standards, including a schedule of periodic tests of data exchange capabilities among the C4I systems of participating armies. The International Digitization Strategy, formulated by the Army Digitization Office, also provides more scope for goal-setting and the evaluation of such goals at the agency level. The digitization strategy seeks to involve allies in the development of shared C4I systems and procedures through information exchange, standardization, cooperative programs, and technology leveraging. As the overseeing agency, DISC4/ADO monitors progress toward achieving a set of objectives, including seamless communication across all systems, dynamic network management, friendly and opposing force location updates, asset availability, real-time engagement planning, "ruggedized" hardware integrated well into battlefield systems, and improved protection from information warfare. DISC4 and the Army Digitization Office have emerged as the key players in the management and assessment of C4I activities. However, the link between these and other Army MFC programs, as well as with DUSA-IA, remains tenuous.
Country targeting and prioritization
Individual programs also prioritize and evaluate their initiatives across countries, but in an ad hoc fashion rather than as part of a unified MFC strategy. Individual agencies such as SARDA have adopted a set of criteria to measure the science and technology cooperation potential of a number of countries. Based on evaluations of comparative demonstrated technical performance, indicators of quality and strength of the supporting infrastructure, and expert consensus, SARDA concluded that a high potential exists for cooperation with Britain, France, Germany, and Japan, and to a lesser extent with Russia, the Ukraine, Israel and Canada.31
AMC has also developed a three-tiered system it uses to categorize partner countries according to their capabilities and needs.32 Prioritization by country in the area of C4I is de facto occurring, with a number of key allies such as Britain, Canada, France, and Germany involved in the most ambitious interoperability programs. The selection of target countries in the area of C4I reflects an implicit assessment of which armies the U.S. would likely fight alongside, and which armies have a sophisticated enough C4I system to be made compatible with the U.S. Army's. However, such criteria do not seem to be part of a formal prioritization and selection process.
3.3 Conclusion
In conclusion, the framework (both objectives and procedures) for evaluating MFC issues in the operations and training area exists in the form of the METL tasks and the assessment teams at both the joint level (J7, JWFC) and the Army level (CALL, CTCs). Assessment is occurring at both the regional and program levels. CINCs and lower-level commanders use training exercises and evaluations to enhance regional MFC. The PfP program conducts training exercise evaluations to enhance MFC between the partner nations. However, the extent to which these training events reflect an Army MFC strategy and enhance Army MFC is unknown. The lack of standardized guidelines and criteria for evaluation makes consistent collection of data on, and analysis of, the effectiveness of MFC training and operational activities difficult. The lack of a multinational lessons learned database further complicates the process of evaluating MFC at the Army level.
31 See the 1998 Army Science and Technology Master Plan for more detail.
32 The development of the International Agreement Tracking System (IATS) should also aid AMC in the assessment of the progress achieved in MFC cooperation with several countries. IATS is a computer system designed to expedite the electronic staffing, approval, and management of International Agreements (IAs). Army personnel are able to determine what is ongoing with partner countries, and who to contact, all relative to specific technical development and testing issues.
C4I, materiel and technology areas appear to face the same problem of a lack of standardized metrics at the Army level. The lack of a guiding MFC policy and a set of standards for evaluation creates significant problems for enhancing Army MFC. While evaluations are most certainly occurring at the unit and program level, there is no assurance that the different units or programs are following a specified Army MFC strategy. For example, two US units may not be able to communicate across secure channels with partner countries in their areas of operations. One unit may identify this as a problem and choose to fix it by lending equipment and training the partner countries in its area of operations. The other unit may not see it as a real problem and may decrease its own secure capabilities by agreeing to use single-channel communications instead of multi-channel frequency hopping. An MFC policy which dictates that communicating in secure modes with all partner countries is a priority would help alleviate this problem prior to an operation. Focusing on this issue during combined training and applying a standardized method for evaluating communications (including distance, number of channels, speed, and radio procedures) would allow all units to meet the same standards, enhance MFC, and thereby increase the probability of coalition operational success.
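The kind of standardized evaluation this example calls for could be expressed as a small checklist with explicit thresholds. The criteria and threshold values below are hypothetical, intended only to show how a common standard would let different units grade partner communications links in the same way.

    # Hypothetical standardized criteria for evaluating secure communications
    # compatibility with a partner unit. Threshold values are illustrative only.
    SECURE_COMMS_STANDARD = {
        "secure_voice_range_km": 30,      # minimum usable secure-voice range
        "min_channels": 2,                # single-channel only would fail
        "frequency_hopping": True,        # multi-channel frequency hopping required
        "common_radio_procedures": True,  # shared radio procedures agreed in advance
    }

    def meets_standard(observed):
        """Return the list of criteria the partner link fails to meet."""
        failures = []
        for key, required in SECURE_COMMS_STANDARD.items():
            value = observed.get(key)
            if isinstance(required, bool):
                if value is not required:
                    failures.append(key)
            elif value is None or value < required:
                failures.append(key)
        return failures

    print(meets_standard({"secure_voice_range_km": 10, "min_channels": 1,
                          "frequency_hopping": False, "common_radio_procedures": True}))

Any unit applying the same checklist would identify the same shortfalls, which is the point of a centrally defined standard.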
III. THE DECISION SUPPORT TOOL - ENHANCING ARMY MFC
1. Introduction
The Army continues to utilize ad hoc methods of working around identified incompatibilities to reach a minimum level of effectiveness prior to multinational operations. This method is neither the most efficient nor the most effective for allowing coalitions to reach their maximum potential either for specific operations or over time. An optimal solution would involve extensive peacetime cooperation to develop combined capabilities between countries, allowing the Army to rapidly build an efficient and effective coalition force prior to deployment. As the range of missions and possible coalition partners is extensive, it is not likely that the Army could coordinate at a maximum level with all possible partners for all possible mission scenarios. Additionally, as the budget for such activities is limited, resources must be dedicated to the most cost-effective activities. Thus, the best approach for developing a successful system for enhancement of Army MFC includes the following steps: First, identify and prioritize likely coalition partners, or countries willing to participate with the U.S. in ground force military operations; next, determine the critical compatibility deficiencies with the likely coalition partners; and lastly, determine the costs of improving compatibility in the deficient areas and focus funding and efforts towards peacetime coordination in those areas with the greatest cost effectiveness. The fluid nature of the international situation requires a flexible system that allows periodic change and re-evaluation. As already discussed, lack of budget data makes the final step currently unachievable. However, the first two steps will go a long way to enhancing Army MFC by providing a blueprint for Army MFC activities. The development of an Army MFC strategy and corresponding assessment methods would significantly increase MFC. An interoperability assessment would accomplish three significant objectives. First, it would enhance the Army’s ability to inform political leaders of
the capabilities and limitations of the multinational force. Second, an assessment would allow the Army to provide the DOD and State Department with information to aid in coalition recruiting efforts. Lastly, a measure of MFC would enhance the Army's and the multinational force commander's abilities to determine the level of integration and mission type for an ally or friendly force.33 Identification of compatibility shortfalls would also provide information to politicians, who could use control over the mitigation measures as leverage in bilateral relationship negotiations; for example, tying technology sharing or equipment sales to an agreement in which the partner country agrees to enhance its self-deployment or logistics capability in return for communications equipment.
2. Identifying Likely Coalition Partners
This portion of the report was completed by two team members for the original RAND report. The following write-up provides a synopsis of their work creating a method for prioritizing possible coalition partners. The first step is to identify and prioritize likely coalition partners. There are several problems that affect accurate identification of the makeup of possible coalitions. At the heart of the matter is the collective action problem, or the paradox that even actions that are clearly in the collective interest will not be undertaken because individual-level incentives act to curtail participation in that action. The problem arises in conditions of "non-excludable public goods," where the benefits of an action cannot be excluded from applying to all members even if only a few have contributed to creating the benefits. Thus, each actor has an incentive to "free ride." There are several methods for encouraging participation. First, the countries involved can privatize the benefits, or in other words, restrict access to the benefits to those who contributed to their production.
33 All of the above-mentioned objectives are derived from the text of FM 100-8, The Army in Multinational Operations, November 1997.
A second method is to link participation to an additional benefit, separate from the benefits derived from participation in the operation. Mancur Olson explains that collective action occurs because there is "some separate incentive, distinct from the achievement of the common or group interest, which is offered to the individuals on the condition that they help bear the costs or burdens involved in the achievement of the group objectives."34 The collective action problem applies to the formation of coalitions under the current international system. The US has a preponderance of military power. Given the enormous military strength of the US relative to any other single state, if the US makes it clear that it has an interest in a specific operation, other states will have an incentive to free ride and let the U.S. bear the majority of the burden for conducting the operation. This is particularly true of the recent peacekeeping and peace enforcement coalition operations such as Somalia, Bosnia and most recently Kosovo, where the US supplied 95% of the air power. In these types of operations, diffuse security benefits make it difficult to recruit countries that are not directly threatened by the instability. To determine which states will overcome this collective action problem, it is important to identify which states stand to benefit from participation. These benefits may come in many forms: monetary payments for participation; prestige associated with participation; or strengthening of ties with the U.S. (the system hegemon) by "bandwagoning." There are several indicators, both direct and proxy, which help to determine which states will overcome the collective action problem to participate in coalition operations with the US at the lowest cost. Past participation in coalition operations, congruity of interests (as indicated by UN voting records on both general and "important"35 issues), common alliance membership, and common ideology (a democratic political system as indicated by the Freedom House rating) are all indicators that a country is relatively more likely than not to join with the U.S. in a coalition operation.
34 Mancur Olson, The Logic of Collective Action, Harvard University Press, Boston, MA, p. 2.
By quantifying these categories we can develop a classification and prioritization of which countries are more likely to coalesce with the United States.36 Some might argue that these factors bias "the results in favor of raising the importance of Europe and of NATO, since four of the seven major previous coalition operations we considered were NATO actions and took place in Europe.37 While the bias and the consequent emphasis on Europe and the European Allies is true, it is also beside the point. Europe constitutes an area of declared USA strategic interest, the US has alliance and other security commitments to many countries in Europe, and it is no accident that most of the operations took place on the European continent."38 For further clarification, while this portion of the project resulted in a hierarchy of countries, the MCAT decision support tool derived in this policy project is not limited to analysis of the specific hierarchy from the above-mentioned report. Rather, it can be and is recommended for use in analyzing not only world-level partner countries, but also specific regional areas of interest for MFC compatibility deficiencies.
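The quantification can be illustrated with a small sketch. The indicator names, weights, and country scores below are hypothetical and are not those used in the RAND country prioritization; the point is only that normalized indicators can be combined into a single propensity-to-ally score and ranked.

    # Indicator names, weights, and country scores are hypothetical; the RAND
    # report's actual coding scheme and weighting are not reproduced here.
    INDICATOR_WEIGHTS = {
        "past_coalition_participation": 0.35,  # share of recent US-led operations joined
        "un_voting_congruity": 0.25,           # agreement with the US on "important" UN votes
        "common_alliance_membership": 0.25,    # 1.0 if a treaty ally (e.g., NATO), else 0.0
        "freedom_house_rating": 0.15,          # democracy proxy, rescaled so 1.0 = most free
    }

    def propensity_to_ally(indicators):
        """Weighted sum of normalized indicators (each expected in the range 0..1)."""
        return sum(weight * indicators.get(name, 0.0)
                   for name, weight in INDICATOR_WEIGHTS.items())

    # Hypothetical country scores for illustration only.
    countries = {
        "Country A": {"past_coalition_participation": 0.9, "un_voting_congruity": 0.8,
                      "common_alliance_membership": 1.0, "freedom_house_rating": 1.0},
        "Country B": {"past_coalition_participation": 0.4, "un_voting_congruity": 0.6,
                      "common_alliance_membership": 0.0, "freedom_house_rating": 0.7},
    }
    for name, score in sorted(((c, propensity_to_ally(v)) for c, v in countries.items()),
                              key=lambda pair: pair[1], reverse=True):
        print(f"{name}: {score:.2f}")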
3. Measuring Compatibility Deficiencies
The second step is, given the likely coalition partners, to determine the compatibility deficiencies between those countries and U.S. ground forces. The Army requires but currently does not use a standardized method of analyzing deficiencies so that a country's compatibility can be compared on a relative standard to other likely coalition partners.
35 "Important" as identified by the U.S. State Department.
36 The country prioritization portion of the report was written by Olga Oliker and Michele Zanini, with oversight by Thomas Szayna. See Thomas S. Szayna, Frances M. Lussier, Krista Magras, Olga Oliker, Michele Zanini, and Robert Howe, Improving Army Planning for Future Multinational Coalition Operations, RAND DRR-2174-A, September 1999, p. 4, for the country prioritization results.
37 The four NATO operations in Europe were: Deliberate Force (Bosnia-Herzegovina), Joint Endeavor (IFOR, Bosnia-Herzegovina), Allied Force (Kosovo), and Joint Guardian (KFOR, Kosovo). The non-NATO and non-European operations were: Desert Storm (Gulf War), Restore Hope (Somalia), and Uphold Democracy (Haiti).
From this relative assessment, the Army can make decisions about which MFC enhancement activities to support and fund.
3.1 The Military Compatibility Assessment Tool
In 1998 RAND designed the DynaRank decision-support system, a Microsoft Excel workbook available for Macintosh and IBM-compatible computers, to assist Department of Defense decisionmaking. DynaRank ranks policy options by cost-effectiveness, based on the relative importance of objectives and a variety of success criteria. Although DynaRank aimed to assist the DoD's high-level resource allocation decisionmaking, its strong sensitivity to strategy, amenability to a variety of data (subjective judgments as well as quantitative analyses), and ability to link several levels of analysis make it useful in supporting other types of defense planning.39 By substantively modifying the DynaRank system, we developed the Military Compatibility Assessment Tool (MCAT). The tool aims to assist DUSA-IA in making strategic-level MFC policy decisions. MCAT does so by pinpointing the extent of compatibility between US ground forces and other countries' armies in specified critical areas, thereby providing a rationale for MFC policy choices. Linking specific MFC assistance policies with the results from MCAT also aids the decision-making process for allocating resources to MFC activities.
38 See Thomas S. Szayna, Frances M. Lussier, Krista Magras, Olga Oliker, Michele Zanini, and Robert Howe, Improving Army Planning for Future Multinational Coalition Operations, RAND DRR-2174-A, September 1999, p. 43.
39 For further details on DynaRank, refer to Richard J. Hillestad and Paul K. Davis, Resource Allocation for the New Defense Strategy: The DynaRank Decision-Support System, RAND, 1998.
MCAT analysis further differentiates between the countries determined as likely to participate with the US in coalition operations (on the basis of the propensity to ally). It provides a means of evaluating the existing level of military compatibility of select foreign ground forces relative to the US Army in a variety of critical mission areas. In addition, MCAT allows for variable weighting of requirements based on different missions, making the tool flexible and adaptable across the mission spectrum. An MCAT analysis produces a hierarchy of countries based on their ground forces’ compatibility with the US Army and identifies both areas of greatest need and potential bottlenecks in coalition operations with the US. It does so in a relative fashion, allowing for comparisons among the countries while judging all against the US as the standard point of reference. The hierarchy provides input to assist DUSA-IA in making decisions about which countries and which capability assessment areas (CAAs) need attention in order to operate more effectively in coalition operations with the US. MCAT aids decisionmakers in pinpointing the most effective policies for enhancing MFC.
3.2 Methodology
The MCAT provides a method for general assessment of a foreign army's current capabilities and associated compatibility with US ground forces. This flexible tool enhances the user's ability to evaluate multiple countries based on a variety of criteria, change criterion weights according to mission requirements, and consolidate the evaluations for comparison. Initially, the user conducts an analysis of a country's capabilities and provides an assessment (score) of the related compatibility in the designated capability assessment areas (CAAs). Once the country assessments are complete, the user opens the MCAT workbook, which consists of three worksheets: the Country Data worksheet, the MFC scorecard, and a Scorecard Functions explanation sheet. The user enters the scores onto the Country Data
worksheet, which automatically transfer to the MFC scorecard. Utilizing the MFC scorecard, the user sets CAA weights based on mission requirements and colors the scorecard. In the final step, the user determines the type of analysis by appropriately flagging the desired compatibility measures (CMs) for each country and then ranking the countries by effectiveness. The effectiveness ranking creates a hierarchical listing of the countries by "overall" compatibility or by compatibility in any one of the three CMs.
The quality of the MCAT results depends upon the quality of the input information. The underlying country evaluations must be both credible and understandable. It is highly recommended that the same set of individuals provide the analysis for each region, and across regions if possible, so as to provide continuity and consistency of grading across the countries. The evaluations, while primarily subjective, may come from specific detailed analysis, spreadsheet models or other quantitative methods, or assessments based on general and specific knowledge and experience. The sensitive nature of the information required to determine a country's military capability and compatibility with US ground forces makes consolidated data in unclassified form difficult to obtain. However, the Military Capabilities Studies conducted for specific countries by the Defense Intelligence Agency supply current and accurate analysis of a country's capabilities from which to make assessments. Additional sources of information include the plans and intelligence sections of Joint, CINC and Army staffs, CIA country studies, military attaches at US embassies, and open sources such as Jane's publications.
[Figure 3.1 appears here in the original: a completed MFC scorecard. Each country occupies four rows (Doctrine, Technology, OR/TRNG, and Overall), scored 0-3 against the nine capability assessment areas grouped as battlefield functions (Maneuver, Fire Support, Air Defense, Mobility/Survivability, Logistics, Deployability, C2, Communications, Intelligence), with a weight row for the CAAs, flag and graph columns, and an aggregate compatibility score for each row at the far right.]
Figure 3.1 – Completed MFC Scorecard.
3.3 Basic Structure of the MFC Scorecard
The MCAT, an Excel-based tool, consists of three worksheets. The user enters data on the base-level worksheet for each country in nine capability assessment areas (Maneuver, Fire Support, Air Defense, Mobility/Survivability, Logistics, Deployability, Command and Control, Communications, and Intelligence), each evaluated by three compatibility measures (Doctrine, Technology, Operational Readiness/Training).
The data automatically transfers to the MFC
scorecard, which presents the aggregate results of the assessment. Figure 3.1 illustrates a completed MFC scorecard. The scorecard illustrates both individual and aggregate levels of information. First, the “overall row” provides a country’s overall level of compatibility in each CAA. Additionally, the aggregate column to the far right of the scorecard provides each country’s aggregate compatibility for each CM. This composite scorecard also provides a ranking of countries based on the compatibility assessment. Through a simple process of re-weighting the flags located at the left of the first column, MCAT creates a hierarchical country ranking reflecting “overall compatibility” or compatibility by CM. Additionally, the scorecard allows a comparison of countries, not only by overall level of compatibility, or CM, but also by levels of compatibility within the individual capability assessment areas.
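The arithmetic behind the scorecard can be sketched in a few lines of Python. This is a minimal illustration, not the MCAT workbook's actual formulas: it assumes the "Overall" row is the simple average of the three CM scores for each CAA, and that the aggregate column is a weight-adjusted average across the nine CAAs; the country names and scores are hypothetical.

    # Minimal sketch of the MFC scorecard arithmetic. Scores use the 0-3
    # compatibility scale; CAA weights default to 1.0. Illustration only.
    CAAS = ["Maneuver", "Fire Support", "Air Defense", "Mobility/Survivability",
            "Logistics", "Deployability", "C2", "Communications", "Intelligence"]
    CMS = ["Doctrine", "Technology", "OR/TRNG"]

    def overall_row(country_scores):
        """Average the three CM scores for each CAA (the scorecard's 'Overall' row)."""
        return {caa: sum(country_scores[cm][caa] for cm in CMS) / len(CMS) for caa in CAAS}

    def aggregate_score(row, weights):
        """Weight-adjusted average across the nine CAAs (the aggregate column)."""
        return (sum(weights.get(caa, 1.0) * row[caa] for caa in CAAS)
                / sum(weights.get(caa, 1.0) for caa in CAAS))

    def rank_countries(data, weights):
        """Hierarchical listing of countries by overall compatibility, highest first."""
        ranked = [(name, aggregate_score(overall_row(scores), weights))
                  for name, scores in data.items()]
        return sorted(ranked, key=lambda pair: pair[1], reverse=True)

    # Hypothetical 0-3 scores: every CAA scored for each CM per country.
    example = {
        "Country 1": {cm: {caa: 2 for caa in CAAS} for cm in CMS},
        "Country 2": {cm: {caa: 1 for caa in CAAS} for cm in CMS},
    }
    print(rank_countries(example, {caa: 1.0 for caa in CAAS}))

The spreadsheet implements the same idea with cell formulas; the sketch simply makes the aggregation explicit.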
Capability Assessment Areas (Columns)/Compatibility Measures (Rows)
We derived the MCAT criteria, both CAAs and CMs, by examining multiple reports on past multinational operations and a variety of Army publications. 40 While these criteria provide a valid and critical examination of a country’s capability/compatibility, the strength of the MCAT lies in the ability to change the criteria if an examination of the underlying assumptions proves they are no longer valid or require change as the situation may dictate. While this will complicate the process by requiring further analysis of a country in the new area, updating both the criteria and the data set on a regular basis (recommend bi-annual) will allow DUSA-IA to stay on top of changes affecting MFC and continue to provide an effective strategic policy.
Utility Values/Body of the MFC Scorecard
MCAT does not automatically complete the large analytic task of filling out the body of the scorecard. The user must make a subjective assessment of a country's capabilities and then relate the assessment to a compatibility score. The underlying assumption for the country analysis is that a country's capabilities directly relate to its compatibility with US ground forces and therefore provide the best method for evaluation. Differences in nations' military capabilities lead to asymmetry between forces. This asymmetry, whether doctrinal, technological or procedural, will complicate interoperability and hinder coordination.
40 Publications include Army Field Manual FM 100-5, Operations, June 1993; Army Field Manual FM 100-8, The Army in Multinational Operations, November 1997; Allied Joint Publication AJP-01(A), Allied Joint Doctrine, March 1999; Joint Task Force Commander's Handbook for Peace Operations, Joint Warfighting Center, February 1995; Roger H. Paulin, Multinational Military Forces: Problems and Prospects, Institute for Strategic Studies, Adelphi Paper 294, 1995; Thomas J. Marshall (ed.), with Phillip Kaiser and Jon Kessmeire, Problems and Solutions in Future Coalition Operations, U.S. Army War College, Strategic Studies Institute, December 1997; Martha Maurer, Coalition Command and Control, National Defense University, Institute for National Strategic Studies and the Center for Advanced Concepts and Technology, 1994; Michele Zanini and Jennifer Morrison Taw, Force XXI: Implications for Multi-Force Compatibility, RAND, DRR-1961-A, September 1998; Elwyn Harris, Stephanie Cammarata, Jody Jacobs, Lewis Jamison, Iris Kameny, and Paul Steinberg, Joint/NATO Interoperability Issues for Force XXI: Final Report for FY98, RAND, AB-215-2-A, August 1998.
In evaluating these asymmetries, we classify countries in terms of compatibility. While similar capabilities often signal an increased level of compatibility between countries, we recognize this is not always the case. Equality of capabilities between nations does not necessarily equate to compatibility between nations. For example, two countries may have the same type of equipment, but use it for different missions.41 In-depth analysis of this type of asymmetry is important when making detailed decisions about individual nations. However, when evaluating relative compatibility, so as to make strategic decisions about MFC policies, the concept can be simplified, allowing for cross-comparisons between nations. We assume that a foreign army with more advanced capabilities in a particular CAA has a higher level of compatibility with US ground forces relative to a country with little or no capability in that CAA. Two countries with similar equipment and doctrine will have a higher compatibility level than two countries with dissimilar equipment or doctrine. This assumption allows us to make subjective judgments on compatibility and cross-comparisons between nations. The aforementioned reasoning led to the development of four levels of compatibility, ranging from "No Capability (0)" to "High Capability (3)." Figure 3.2 presents both the levels of compatibility and the number/color scheme used on the MFC scorecard.
0  No capability (no compatibility)
1  Low capability (long-term fixes required for compatibility)
2  Medium capability (short-term fixes and workarounds will ensure compatibility)
3  High capability (highly compatible/mission ready)
Figure 3.2 – Compatibility Levels
41 Concept of asymmetry taken from Thomas J. Marshall, ed., Problems and Solutions in Future Coalition Operations, Steven Metz, Chapter 3.
Note that the definitions for levels 1 and 2 (orange and yellow) derive from the ability to increase compatibility either through long-term fixes (orange) or short-term fixes and workarounds (yellow). A 1998 RAND study on MFC entitled Force XXI: Implications for Multi-Force Compatibility derived the following definitions to explain long- and short-term fixes. Broadly defined, short-term fixes or workarounds represent those steps taken prior to a deployment, once such deployment becomes a distinct possibility (i.e., during the planning phases of an operation). Long-term fixes are solutions sustained across operations.
3.4 Weighting of Capability Assessment Areas
MCAT allows for varying the relative weights of the CAAs. Initially, MCAT generates results with CAAs of equal weights. However, if the user determines the CAAs are unequal in their impact on compatibility, the MCAT allows for re-weighting of the CAAs to represent their relative importance. The weighting of the CAAs for MFC is most likely to vary based on the following mission variables:
1. High versus Low Intensity Conflict
2. Long Lead Time versus Short Lead Time (time between deployment of forces and commencement of operations)
3. Level of Integration (Div, Bde, Bn, Co)
4. Command Structure (Lead Nation, Parallel, Integrated)
For example, a high-intensity conflict with a short lead-time might require greater levels of compatibility in certain areas. Intelligence and deployability may carry more of an impact than in a mission scenario with a long lead-time. In a low-intensity conflict scenario with a high degree of integration, C2 and communications may be more important than fire support or air defense. The MCAT has the ability to reflect these decisions. While weighting of CAAs provides a useful method of evaluating MFC across different mission types, it is subjective, and the user must understand the limitations of re-weighting the CAAs. Any number of different results can be obtained by manipulating the weights. Therefore, adjusting the weights of the CAAs is more apt to facilitate the finding of robust options among the many different alternatives for MFC funding than to ensure that the "correct" options are highlighted.
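The self-contained snippet below illustrates how re-weighting CAAs for a hypothetical high-intensity, short lead-time mission can reorder the resulting country hierarchy; all scores and weight values are invented for illustration and are not taken from the MCAT workbook.

    # Illustration of how re-weighting CAAs can reorder countries.
    # All 0-3 scores and weight values are hypothetical.
    def weighted_avg(scores, weights):
        """Weighted average of a country's per-CAA scores."""
        return sum(weights[k] * scores[k] for k in weights) / sum(weights.values())

    countries = {
        "Country X": {"C2": 3, "Communications": 3, "Intelligence": 2, "Deployability": 2},
        "Country Y": {"C2": 1, "Communications": 2, "Intelligence": 3, "Deployability": 3},
    }
    equal_weights = {caa: 1.0 for caa in countries["Country X"]}
    # Hypothetical emphasis for a high-intensity, short lead-time mission.
    short_lead_weights = dict(equal_weights, Intelligence=3.0, Deployability=3.0)

    for label, weights in (("equal weights", equal_weights),
                           ("short lead-time weights", short_lead_weights)):
        ranking = sorted(countries, key=lambda c: weighted_avg(countries[c], weights), reverse=True)
        print(label, ranking)

Under equal weights Country X ranks first; emphasizing intelligence and deployability moves Country Y ahead, which is exactly why manipulated weights should be used to look for robust options rather than a single "correct" answer.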
3.5 Flagging of Compatibility Measures To facilitate different levels of analysis, the user may utilize the flagging function to determine the type of hierarchy MCAT produces. Based on the flagging, MCAT produces country rankings by either overall compatibility or by any one of the three CMs (Doctrine, Technology, or Operational Readiness). See appendix B, Figure B.3 for an example of flagging the CMs for analysis.
3.6 Linking MCAT Data to MFC Assistance Policies
The previously mentioned 1998 RAND study on MFC analyzed the effects of Force XXI on MFC in future coalition operations. An intensive investigation of four major multinational operations revealed the repetitive use of certain "mitigating measures" to overcome compatibility
problems between the US and its coalition partners. Figure 3.3 outlines the results of the study's findings. The study also demonstrated the value of engagement over the long term, or the advantage of fixes over workarounds. Such initiatives as combined training, multilateral command post exercises, technological research and development, doctrinal development, and intelligence-sharing protocols will be increasingly important as forces with vastly different capabilities attempt to coordinate their efforts.42 Linking the MCAT results to this study reveals several significant policy applications. MCAT analyzes a country's compatibility with US ground forces relative to other possible coalition partners and identifies shortfalls in compatibility, categorizing them by both capability area and compatibility measure. Not only can the user identify a compatibility shortfall by its functional area (e.g., intelligence, C2), but he or she can further identify whether the problem is a doctrinal, technological, or operational readiness shortfall. Utilizing the data from the Force XXI study, the user can then pair the shortfall with a mitigating measure.
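One way to make that pairing concrete is a simple lookup keyed by problem area and coalition context. The sketch below is illustrative only: the context labels are simplified, and the entries paraphrase a few of the measures catalogued in Figure 3.3 rather than reproducing the full framework.

    # Hedged sketch of pairing an identified shortfall with candidate measures.
    # Entries paraphrase a few measures from Figure 3.3 (Zanini and Taw); the
    # full framework is not reproduced, and the context labels are simplified.
    MITIGATION = {
        ("C4I", "ad hoc, short lead-time"): [
            "provide C4I equipment", "exchange liaisons",
            "develop intel-sharing protocols",
            "establish lead-nation C2 structure or geographic separation",
        ],
        ("C4I", "alliance"): [
            "develop combined exercise training and intel-sharing protocols",
            "integrate C2 structures and forces", "jointly develop equipment",
        ],
        ("Logistics & Deployability", "ad hoc, long lead-time"): [
            "phase deployment", "provide logistics and lift", "pre-position materiel",
        ],
    }

    def suggest_measures(problem_area, context):
        """Look up candidate mitigation measures for an identified shortfall."""
        return MITIGATION.get((problem_area, context),
                              ["no catalogued measure; further analysis required"])

    print(suggest_measures("C4I", "ad hoc, short lead-time"))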
3.7 Operationalizing the MCAT for Analysis
The methodology used to develop the MCAT provides a sound backdrop for analysis of MFC. However, as the Army utilizes this decision support tool, the capability assessment areas and compatibility measures should be re-examined in accordance with current mission experiences so as to ensure that all key factors for MFC are covered. This report reviewed a series of missions from the last decade, and the current design of the MCAT concentrates on traditional military battlefield functions. It may be that in the Army's near future, more emphasis should be placed on non-combat categories stemming from the peacekeeping and peace enforcement experiences in Bosnia and Kosovo.
42 Zanini and Taw, Force XXI: Implications for Multi-Force Compatibility.
These categories might include "mediation and negotiation capability" or "policing" as critical MFC requirements. DUSA-IA should consider organizing a panel of experts who can critically examine and debate the areas most critical to MFC requirements, based on the most recent mission experiences.
Figure 3.3 – Framework for Deriving Mitigation Measures (Fixes in Bold)
[The original figure is a matrix. Rows group problems into three areas (C4I; Logistics and Deployability; Doctrine, Procedures and Employment), each subdivided into operational, organizational, and technological problems. Columns distinguish three coalition contexts: ad hoc (high or low intensity) with long lead-time, ad hoc (high or low intensity) with short lead-time, and alliance (high or low intensity) with long or short lead-time. Cell entries list the applicable mitigation measures, for example providing C4I equipment and liaisons, IMET, developing intel-sharing protocols, establishing a lead-nation C2 structure, setting up a C3IC, relying on geographic separation, loaning/sharing/selling equipment, relying on the lowest common denominator, COTS, or SATCOM where not compromised, phasing deployment, providing logistics and lift, pre-positioning materiel, leasing lift and local transportation, stovepiping, combined and pre-deployment training and exercises, inviting LNOs to TRADOC and the War College, providing missing capabilities such as force protection, establishing a quick reaction force, and co-developing equipment and doctrine.]
*The relative importance of fixes and workarounds changes with the amount of lead-time: with short lead-times, fixes become more important since some workarounds may not be feasible.
IV. CONCLUSIONS
Given the rising number of joint and combined operations over the past decade, the Army faces the critical task of enhancing ground force performance in coalition operations. Strategic guidance from the President and the Joint Chiefs of Staff directs the Army to prepare for coalition operations, strengthening alliances and coalitions, "to meet the challenges of an evolving security environment."43 There are three critical components to effective ground force operations in a coalition operation: US Army capabilities developed under its Title X responsibilities, contributions by coalition partners, and the collective ability to interact effectively to accomplish coalition military goals. This study addresses the third element and provides a method for evaluating the current status of multinational force compatibility with possible partner countries across the spectrum of possible missions in a variety of critical mission functions.
1. Policy Recommendations
Based on this analysis, I recommend that:
•
DUSA-IA clearly detail explicit long-term Army MFC priorities. This is a critical function of the office, and should be the first step towards their efforts at enhancing MFC.
•
DUSA-IA and the Army adopt the MCAT methodology as part of the decision support system used to make decisions about MFC priorities, activities and funding.
43 A National Security Strategy for a New Century, The White House, October 1998.
•
DUSA-IA organize a panel of experts with recent operations experience to identify the critical assessment areas, so as to operationalize the MCAT for analysis.
•
DUSA-IA develop and push for acceptance of standardized evaluation criteria for identified critical MFC activities, at a minimum within the Army and if possible for all US services.
2. Policy Implications
The Military Compatibility Assessment Tool (MCAT) integrates both qualitative and quantitative analysis to provide the Army with useful information for making international activity decisions. It allows the Army to identify which activities with which partners will be the most effective in enhancing MFC. Beyond providing a general analysis of the current worldwide situation, the analysis can be tailored to fit any region or mission type so as to provide specific guidance for contingency planning. Finally, it can assist the Army with developing a guiding strategic policy for MFC. All of these factors will allow the Army to play a greater role in both the operational and budget planning cycles of the JCS and the CINCs. This assessment method can significantly enhance Army MFC planning. An interoperability assessment accomplishes several additional critical objectives. First, it identifies potential problems in MFC in advance of an operation. Second, an assessment allows the Army to provide the DOD and State Department with information to aid in coalition recruiting efforts. Third, it enhances the Army's ability to inform political leaders of the capabilities and limitations of the multinational force. Finally, a measure of MFC enhances the Army's and the
multinational force commander’s abilities to determine the level of integration and mission type for an ally or friendly force. 44
3. Obstacles to Implementation
Both DUSA-IA and the Army will face multiple obstacles to implementing an MFC-enhancing policy. DUSA-IA already faces numerous challenges, as identified at the outset of this study. Its efforts to enhance Army MFC are hampered by a lack of vision and control over MFC-enhancing activities and funding, both of which generally fall within the CINCs' span of control. It will face a serious challenge from the CINCs, their staffs, and other regional planners when it attempts to set Army strategic policy and guidelines for MFC activities and funding. The best method for overcoming this obstacle is to ensure that all the relevant parties are included in the planning process. It is recommended that the Joint and CINC staffs use the MCAT in their own analyses and then compare answers, or, at a minimum, that DUSA-IA keep them informed of the process and welcome their insights. This will make them more likely to accept the results of the analysis and the decisions about what the strategic policy should encompass. DUSA-IA could gain CINC and Joint staff buy-in in several different ways. It could request that a Joint Task Force be created which would include key players from all levels. This Task Force could critically examine the CAAs and CMs to ensure that the underlying assumptions for the current MCAT categories remain valid, and then operationalize the MCAT and conduct any necessary troubleshooting. DUSA-IA could also set up a feedback system, whereby it provides its analyses to the CINC and Joint staffs and asks for critical feedback, especially in the initial stages of operationalizing the system.
44 All of the above-mentioned objectives are derived from the text of FM 100-8, The Army in Multinational Operations, November 1997.
Because the CINC and Joint staffs control the funding and scheduling of MFC activities, it is critical that they buy into the Army plan for MFC. The nature of the decisions supported by the MCAT is also likely to bring heat on DUSA-IA from both inside the US military and from important allies. The MCAT demonstrates shortfalls in a country's capabilities and affects how resources will be allocated to address those shortfalls. Thus, certain countries will receive additional attention while others may receive very little. It also provides a relative judgment as to the sophistication of a country's military, which can generate opposition. It is critical that the use of the MCAT be based on credible and understandable criteria and country evaluations. DUSA-IA must develop a team of qualified and impartial evaluators to utilize the MCAT. Additionally, while I do not recommend collaboration with allies when making the evaluation (to avoid the political pressures which would most likely accompany any such collaboration), it is advisable to discuss relevant results with key allies and address deficiencies in a manner acceptable to all parties involved.
4. Next Steps
While there are many topics which relate to the enhancement of MFC, the most critical issue derived from this study is the need to determine which mission areas are most critical to MFC and how they change with different types of operations. The US military faced a significant challenge over the past decade. It went from the clear-cut missions and heavy offensive militaries of the Cold War to the uncertainty of what was originally termed "operations other than war" and is now more commonly known as peace operations. Not only did the US have to plan for new types of missions, but it also had to learn to work with an abundance of military allies. From operations in Somalia in 1994 to the current operations in
Bosnia and Kosovo, the Army has been learning how to work with other militaries. These lessons need to be compiled, studied and synthesized into a comprehensive doctrine on MFC, specifically identifying the critical factors leading to success across the spectrum of coalition operations.
APPENDIX A. DIRECTIONS FOR APPLICATION OF THE MCAT
APPLICATION OF MCAT FOR MFC STRATEGIC PLANNING
The following steps outline how to use the MCAT. The remainder of the section provides further details on each of the steps.
Steps for completing an MCAT Analysis
STEP 1. Fill out the Country Data worksheet.
STEP 2. Transfer data to the MFC scorecard.
STEP 3. Set weights for CAAs on the MFC scorecard.
STEP 4. Color the MFC scorecard.
STEP 5. Set flags for the type of analysis and rank the MFC scorecard for analysis.
NOTE: Outlined steps in boxes denote computer steps.
STEP 1: Filling Out the Country Data Worksheet
•
Open the MCAT workbook
•
Go to “Country Data Worksheet”
•
Enter the country name over “Country 1”. “Return” will replace all appropriate areas with the country name.
•
Enter country compatibility values into appropriate rows/columns. DO NOT input data into overall column
Country Data Worksheet: The worksheet body contains the information which will be entered on the scorecard at the intersection of the CAAs (columns) and the CMs (rows). The user fills out this portion of the worksheet using primarily subjective assessments of each country based on analytical research of the country's capabilities and their associated compatibility with US ground forces. Recall the four levels of compatibility used for evaluation:
0 - No capability (no compatibility)
1 - Low capability (long-term fixes required for compatibility)
2 - Medium capability (short-term fixes and workarounds will ensure compatibility)
3 - High capability (highly compatible/mission ready)
Figure A.1 - Compatibility Levels

Compatibility measures
The user evaluates each CAA using three CMs. These CMs, drawn from a review of
recent multi-force operations after-action reports and studies and current Army doctrine and regulations, represent the critical measures affecting the compatibility between two forces.45 While compatibility problems are often attributed to equipment or technology differences between nations, the MCAT takes into account the more complicated and realistic view that compatibility is based on a variety of factors. The compatibility measures used in the MCAT are doctrine, technology/equipment, and operational readiness/training.
DOCTRINE: How do they fight? Does their doctrine cover the full treatment of strategic, operational and tactical issues? Does it cover Joint and Combined operations? Does it emphasize Offense, Defense or both? What type of operations does it cover?
TECHNOLOGY: What do they fight with? What is their modernization level?
OPERATIONAL READINESS/TRAINING: What are their forces capable of? What types of missions are they trained to conduct? What do recent evaluations/exercises demonstrate about the readiness of their forces?
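These three measures, combined with the 0-3 scale in Figure A.1, define the layout of the Country Data worksheet: one value per CM and CAA for each country. The following is a minimal sketch of that layout, offered purely as an illustration; the Python names used here (LEVELS, CAAS, CMS, country_1) are hypothetical and are not part of the MCAT workbook, which is implemented in Excel on top of DynaRank.

```python
# Illustrative data model for an MCAT country evaluation. Names are
# hypothetical; the actual tool is an Excel/DynaRank workbook, not Python.

# The four compatibility levels from Figure A.1
LEVELS = {
    0: "No capability (no compatibility)",
    1: "Low capability (long-term fixes required for compatibility)",
    2: "Medium capability (short-term fixes/workarounds ensure compatibility)",
    3: "High capability (highly compatible / mission ready)",
}

# Nine capability assessment areas (columns) and three compatibility
# measures (rows) that make up the Country Data worksheet
CAAS = ["Maneuver", "Fire Support", "Air Defense", "Mob/Surv",
        "Logistics", "Deployability", "C2", "Commo", "Intel"]
CMS = ["Doctrine", "Technology", "OR/Training"]

# A country evaluation is one 0-3 value per (CM, CAA) cell of the worksheet;
# only the first CAA is shown here, using the sample values from Appendix B.
country_1 = {
    ("Doctrine", "Maneuver"): 3,
    ("Technology", "Maneuver"): 2,
    ("OR/Training", "Maneuver"): 2,
    # ... the remaining (CM, CAA) cells are filled in the same way
}

print(LEVELS[country_1[("Doctrine", "Maneuver")]])  # -> "High capability ..."
```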
Capability assessment areas
The 1998 RAND study on Force XXI and MFC issues categorized the compatibility issues identified in each of the four operations analyzed into three specific areas: C4I, Logistics and Deployability, and Doctrine, Procedures and Employment.46 For enhanced evaluation purposes, we further divided them into nine compatibility assessment areas derived from both the abovementioned study and Army doctrine on multinational operations.47 The following sets of questions for each of the nine CAAs are designed to guide the user in the capability assessment.48

45 Publications reviewed include: FM 100-5 (1993), FM 100-8 (1997), Paulin, Multinational Military Forces: Problems and Prospects; Marshall, Problems and Solutions in Future Coalition Operations; Maurer, Coalition Command and Control.
46 Zanini and Taw, Force XXI: Implications for Multi-Force Compatibility.
1. Maneuver
Doctrine: How do they fight? Is their doctrine Soviet or US based? Primarily defensive or offensive oriented? What is the size of their basic maneuver force? Do they plan for combined arms operations, joint operations, combined operations?
Technology: What is the modernization level of their maneuver equipment? What is the mobility of their formations? What type of communications equipment do their maneuver formations have?
OR/Training: What is the effectiveness of their formations? What is their assessed maneuver capability? Do they have training for operating in special environments (Airborne, Air Assault, Urban, NBC)?
2. Fires
Doctrine: What is their doctrine on allocation of firepower? Who controls their firepower assets? What are their fire control measures?
Technology: What type of targeting/firepower equipment do they have?
47 FM 100-5, FM 100-8.
48 These questions were drawn from analysis of FM 100-5 and FM 100-8 and most likely do not provide all of the relevant questions. However, including additional questions may unnecessarily complicate the analysis process. There are multitudes of questions pertaining to each CAA, and delineation of all of the relevant questions would be overwhelming and unproductive. To complete a general, relative assessment of compatibility across nations with the MCAT, a basic subjective assessment of a nation's capability is required.
OR/Training: What is their capability for long range/massed fires? What is their capability to integrate and synchronize fires?
3. Air Defense
Doctrine: What are their priorities for air defense? Who controls their AD assets? What are their fire control measures?
Technology: What are their weapons systems types and capabilities? What type of communications equipment do they have? Do they have theater missile defense capability?
OR/Training: What are their AD capabilities? Can they integrate their fires into the theater control and reporting center (CRC)?
4. Mobility/Survivability/Force Protection
Doctrine: Does it cover the use of mobility/countermobility measures? Do they plan for force protection?
Technology: What type of engineer equipment do they have? What assets do they maintain for force protection? What type of NBC equipment do they have?
OR/Training: What type of mobility/countermobility capabilities do they have? Can they provide survivability measures for their own forces, particularly NBC?
5. Logistics
Doctrine: Do they have a push or a pull logistics doctrine? Do they plan for RSO&I operations? What classes of supply can they provide and what are the priority and stockage levels? What types of services do they plan on providing? What are their maintenance standards? How long do they plan on sustaining initial forces? What are their plans for follow-on sustainment of forces?
Technology: What is their modernization level for equipment? What is their logistics mobility capability? What type of system (manual/ADP) do they utilize to track their logistics?
OR/Training: What size of a deployed force can they sustain, with what classes of supply and services, and for how long? What level of maintenance do they achieve?
6. Deployability
Doctrine: Does their doctrine cover strategic deployment of forces? Do they plan for RSO&I activities?
Technology: What modes of transport do they maintain (land, air, sea, rail)? What technology do they utilize for planning deployment (manual/ADP)?
OR/Training: What type of deployment can they conduct? What size force can they self-deploy, to what distance, and at what speed?
7. Command and Control
Doctrine: Do they have large staffs with the technical means to support planning (command estimate process) for both current and future operations? Do they maintain battle staffs? Can they process, reproduce and rapidly disseminate operational plans? What is the decision-making authority of the subordinate commanders and staffs (decentralized or centralized decision-making processes)? Do they have a flexible command structure giving commanders the freedom to execute the mission with minimal guidance? What is their reporting system? What graphics and control measures do they utilize to control the battlefield?
Technology: Do they have the communications, computers and intelligence means to support the commander's decision-making process? Do they have the ability to use space-based systems for reconnaissance, surveillance, navigation, and positioning to facilitate battle command?
OR/Training: What is the assessment of their C2 structure from current exercises and operations?
8. Communications
Doctrine: What type of orders format and dissemination methods do they utilize? Do they plan for communications capabilities for home station, en route and in-theater operations? Can they operate using English or do they plan for liaisons and translators?
Technology: What types of signal support systems are available to the commanders at different levels? Are their systems digital or analog? Do they allow for digital-to-analog interfaces? Do they have secure communications capabilities and can they tie into US systems (STUs, SINCGARS, SATCOM)? Do they have access to space-based systems?
OR/Training: What is the assessed level of their communications capabilities? Are their communications systems reliable and can they establish commo channels in a timely manner? What is their level of "US Military English" proficiency (officers, NCOs, enlisted)? Do they have liaisons with good English language skills and understanding of US doctrine and procedures? What are their translation capabilities?
9. Intelligence
Doctrine: What type of collection capabilities/assets do they possess (HUMINT, SIGINT, IMINT, ELINT), and how do they utilize them in support of political and military objectives?
Technology: What is the sophistication of their collection assets? What technical means do they have to produce and disseminate intelligence? Can they link into the US intelligence system (e.g., SATCOM)?
OR/Training: What is their capability (sophistication and focus) for processing intelligence (collection, production and dissemination)?
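For evaluators who prefer to keep the nine question sets above in a structured form, they map naturally onto a small lookup table keyed by CAA and CM. The sketch below is purely illustrative (the variable name and layout are not part of the MCAT workbook) and reproduces only a few of the Maneuver and Fires questions as examples.

```python
# Illustrative structure for the CAA question sets (hypothetical layout;
# the MCAT itself keeps these as guidance text, not as data in the workbook).

ASSESSMENT_QUESTIONS = {
    "Maneuver": {
        "Doctrine": [
            "How do they fight? Is their doctrine Soviet or US based?",
            "Do they plan for combined arms, joint, and combined operations?",
        ],
        "Technology": [
            "What is the modernization level of their maneuver equipment?",
        ],
        "OR/Training": [
            "Do they have training for operating in special environments?",
        ],
    },
    "Fires": {
        "Doctrine": ["Who controls their firepower assets?"],
        "Technology": ["What type of targeting/firepower equipment do they have?"],
        "OR/Training": ["What is their capability for long range/massed fires?"],
    },
    # ... the remaining seven CAAs follow the same pattern
}

# Example: list the doctrine questions an evaluator would work through for Fires
for question in ASSESSMENT_QUESTIONS["Fires"]["Doctrine"]:
    print(question)
```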
Weighting, Coloring and Ranking the MFC Scorecard
The remaining four steps describe how to transfer country data to the scorecard and configure it for analysis. Figure A.2 demonstrates the correct data entry points for both weights and flags.

STEP 2: Transfer data to scorecard
• Select the "MFC Scorecard" worksheet from the MCAT workbook. Data from the Country Data Worksheet automatically transfers to the scorecard.

STEP 3: Set CAA weights on MFC scorecard
• Set weights for CAAs on the scorecard, typing the number in the correct column.
STEP 4: Color scorecard
• Go to the Microsoft Excel toolbar menus.
• Select "SC Tools" and select "Color Scorecard".
Note: Coloring is not done automatically on the scorecard because of software limitations. Changes made to weights and country values require a resetting of the colors. To remove color, repeat the first two steps and select "Uncolor Scorecard".49 An illustrative sketch of the color banding follows this step.

49 Hillestad and Davis, Resource Allocation for the New Defense Strategy: The DynaRank Decision-Support System.
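The coloring macro itself belongs to the DynaRank workbook, and its exact thresholds and palette are not documented in this paper. The sketch below is only a minimal illustration of the idea, assuming a high threshold of 3 and a low threshold of 1, consistent with the "High Color Value" and "Low Color Value" settings shown on the sample scorecard; the function name and the traffic-light colors are assumptions, not features of the tool.

```python
# Minimal sketch of score-to-color banding (illustrative only; the actual
# coloring is done by the DynaRank "SC Tools > Color Scorecard" macro in Excel).

HIGH_THRESHOLD = 3   # assumed "High Color Value" from the sample scorecard
LOW_THRESHOLD = 1    # assumed "Low Color Value" from the sample scorecard

def cell_color(score: float) -> str:
    """Map a compatibility score to a traffic-light style color band."""
    if score >= HIGH_THRESHOLD:
        return "green"    # high capability / highly compatible
    if score <= LOW_THRESHOLD:
        return "red"      # low or no capability / long-term fixes needed
    return "yellow"       # medium capability / workarounds possible

# Example: an overall score of 2.3 (Country 1's Maneuver row in Appendix B)
# would band as medium under these assumed thresholds.
print(cell_color(2.3))   # -> "yellow"
```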
[Figure A.2 – Partial MFC Scorecard Demonstrating Weight and Flag Options. The scorecard lists the nine CAAs (Maneuver, Fire Support, Air Defense, Mob/Surv, Logistics, Deployability, C2, Commo, Intel) as columns with a default weight of 100 each, the CMs (Doctrine, Technology, OR/TRNG, Overall) as rows for each country, a flag column for selecting rows to rank, high/low color-value settings, and a Compatibility Score column for each row.]
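The workbook's exact cell formulas are not reproduced in this paper, but the arithmetic implied by the sample scorecards in Appendix B is straightforward: each CAA's Overall row is the simple average of its three CM scores, and the Compatibility Score column is the weighted average of a row's nine CAA values (with the default weights of 100 this reduces to a simple average). The Python sketch below illustrates that arithmetic under those assumptions; the function names are illustrative, not part of the workbook.

```python
# Illustrative arithmetic behind the MFC scorecard (assumed from the sample
# scorecards; the actual computation lives in the DynaRank Excel workbook).

def caa_overall(doctrine: int, technology: int, or_training: int) -> float:
    """Overall score for one CAA = average of its three CM scores."""
    return (doctrine + technology + or_training) / 3

def compatibility_score(caa_scores: list[float], weights: list[float]) -> float:
    """Row score = weighted average of the nine CAA scores."""
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(caa_scores, weights)) / total_weight

# Country 1's Doctrine row from the sample worksheet (Appendix B, Figure B.1)
doctrine_row = [3, 3, 3, 2, 2, 2, 3, 2, 3]
weights = [100] * 9                      # default weight of 100 for every CAA

print(round(compatibility_score(doctrine_row, weights), 1))  # -> 2.6
print(round(caa_overall(3, 2, 2), 1))                        # -> 2.3 (Maneuver)
```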
STEP 5: Set flags for type of analysis and rank MFC scorecard
Note: To rank the countries, place the same number in the flag column of all of the options you want ranked together, and place a different, lower number in the remaining rows. For example, to rank by overall category, place a 1 in the overall row for each country and a 0 in all other rows. It is possible to rank all similar categories hierarchically on the same card. See Figure B.3 for an example; a minimal sketch of this flag-based ranking follows these steps.
• Set flags for the type of analysis, typing the number in the appropriate flag column.
• Reset the program by selecting the Country Data Worksheet and then selecting the MFC Scorecard.
• Go to the Microsoft Excel toolbar menus.
• Select "SC Tools".
• Select "Rank".
• Select "By Effectiveness" (DO NOT select "By Cost" or "By Cost Effectiveness").50
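The ranking itself is performed by DynaRank's "Rank > By Effectiveness" command. The sketch below only illustrates the underlying idea, assuming that rows carrying the chosen flag value are sorted together by their compatibility score; the row layout and function name are illustrative, not part of the workbook.

```python
# Illustrative flag-based ranking (the real ranking is done by DynaRank's
# "SC Tools > Rank > By Effectiveness" command in Excel).

def rank_rows(rows: list[dict], flag: int) -> list[dict]:
    """Rank the scorecard rows carrying the chosen flag, highest score first.
    Rows with other (lower) flag values are left out of the ranking."""
    flagged = [r for r in rows if r["flag"] == flag]
    return sorted(flagged, key=lambda r: r["score"], reverse=True)

# Example: rank only the "Overall" rows (flag = 1), as described in the note
# above, using the scores from the sample scorecard in Appendix B.
rows = [
    {"country": "Country 1", "cm": "Overall", "flag": 1, "score": 2.1},
    {"country": "Country 2", "cm": "Overall", "flag": 1, "score": 1.3},
    {"country": "Country 3", "cm": "Overall", "flag": 1, "score": 1.3},
    {"country": "Country 1", "cm": "Doctrine", "flag": 0, "score": 2.6},
]
for r in rank_rows(rows, flag=1):
    print(r["country"], r["score"])   # Country 1 first, then Countries 2 and 3
```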
To return to the original country rankings:
• Go to the Microsoft Excel toolbar menus.
• Select "SC Tools".
• Select "UnRank".
• Use the slider bar at the bottom right-hand corner of the screen to drag the picture back to the left to the MFC Scorecard.

50 These functions are a DynaRank option not adapted to MCAT analysis.
APPENDIX B. SAMPLE MCAT APPLICATION
SAMPLE MCAT WORKSHEET AND SCORECARD
In the following example, I apply the MCAT process to three countries, providing specific capability/compatibility analysis for Country "1" and sample scores for Countries "2" and "3". Figure B.1 provides an example of a completed Country Data Worksheet. Figures B.2 and B.3 provide MCAT scorecards based on the compatibility analysis of the three countries.

Sample Country Data Worksheet and Country "1" Assessment
Country     CM           Maneuver  Fire Sup  Air Def  Mob/Surv  Logistics  Deploy  C2    Commo  Intel
Country 1   Doctrine        3         3        3         2          2        2      3      2      3
Country 1   Technology      2         2        2         2          1        2      2      2      3
Country 1   OP/TRNG         2         2        2         2          1        1      2      2      2
Country 1   Overall        2.3       2.3      2.3       2.0        1.3      1.7    2.3    2.0    2.7
Country 2   Doctrine        2         2        1         1          2        1      2      2      1
Country 2   Technology      2         1        1         1          1        0      2      1      1
Country 2   OP/TRNG         2         1        1         1          1        0      2      1      1
Country 2   Overall        2.0       1.3      1.0       1.0        1.3      0.3    2.0    1.3    1.0
Country 3   Doctrine        1         1        1         1          2        2      1      2      1
Country 3   Technology      2         2        1         2          1        1      1      2      2
Country 3   OP/TRNG         1         1        1         1          1        1      1      1      1
Country 3   Overall        1.3       1.3      1.0       1.3        1.3      1.3    1.0    1.7    1.3

Figure B.1 – Sample Excel Spreadsheet (Country Data Worksheet)
Country “1” Assessment
1. Maneuver
Doctrine: Doctrine covers full spectrum of combined arms, joint, and combined operations. Country 1 has traditionally emphasized defensive operations, although the latest national military strategy emphasizes offensive, power-projection missions. Their ground forces are centered around the brigade-sized maneuver unit, and follow U.S. doctrine. Score: 3
Technology: Country 1's maneuver equipment is comparable with U.S. standards for hardware. However, they have not completed recent "software" upgrades in communications and targeting. Score: 2
Op. Readiness: Maneuver forces are capable of conducting fast-paced operations in different terrain. However, Country 1's forces are not suited for operating in a WMD environment; they lack adequate NBC equipment and protection and decontamination capabilities. Score: 2

2. Fire Support
Doctrine: Plans for wide use of long-range massed fires, both organic and joint, to support both offensive and defensive operations. Maneuver-unit (brigade) commander has control over tactical firepower assets. Score: 3
Technology: A recent ATACMS acquisition program will significantly increase their technological capability to fight deep. They utilize both self-propelled and towed artillery compatible with US standards. Country 1's artillery units have effective fire control centers. However, due to lack of upgrades in software, they are limited in their real-time targeting capabilities. Score: 2
Op. Readiness: Country 1 is effective in controlling fires for the close battle. Their ability to coordinate joint fires remains less than adequate, and organic deep assets have yet to be integrated. Score: 2

3. Air Defense
Doctrine: Country 1's doctrine develops a comprehensive set of fire control measures, compatible with U.S. standards. Score: 3
Technology: No deployable Theater Missile Defense (TMD) system. Country 1 possesses organic Air Defense (AD) assets, requiring laser-target designators. IFF based primarily on visual recognition. Score: 2
Op. Readiness: Limited IFF capability hampers their training and operational readiness. Score: 2

4. Mobility/Survivability/Force Protection
Doctrine: Doctrine covers full spectrum of mobility/countermobility, but force protection is less developed. Score: 2
Technology: Substantial and modern engineering equipment with high mobility. Inadequate NBC protection and decontamination equipment, as well as lack of TMD, weakens survivability. Score: 2
Op. Readiness: Fully capable of conducting engineer activities across the battlefield. However, lack of adequate force protection and NBC training severely decreases their readiness in this area. Score: 2

5. Logistics
Doctrine: "Pull" logistics doctrine is a legacy from this country's traditionally defensive orientation. Recent reforms in doctrine place increased emphasis on a "push" and joint approach. Country 1 can provide the full range of supply classes and required services to sustain its force for regional operations. Doctrine sets high maintenance standards. Score: 2
Technology: Despite recent acquisitions and equipment upgrades to enhance deployability, Country 1 continues to suffer from low mobility of its logistics assets. They have not fully developed a capability to support rapid offensive operations or strategic deployments and continue to rely primarily on a manual tracking system for logistics. Score: 1
Op. Readiness: Limited capabilities to support strategic deployment. The logistic units of Country 1 have not re-configured fully to provide both robust and economical organic support to the combat forces and sound higher-echelon support to reinforce, replenish, and re-supply the organic capability. Cutbacks in defense spending have led to a decrease in overall maintenance standards and low stockage levels of repair parts. Score: 1
6. Deployability
Doctrine: Country 1 began to develop doctrine on strategic deployments in 1995. Current doctrine calls for rapid force projection and the ability to conduct protracted operations abroad. Score: 2
Technology: They have pursued a vigorous program to reorganize and equip their forces to meet doctrinal requirements for force projection. Recent acquisition programs have given Country 1 the technological capability to rapidly self-deploy a brigade-size force in support of strategic operations anywhere in the world. Score: 2
Op. Readiness: While Country 1 is doctrinally and technologically prepared to self-deploy for operations abroad, its forces require significant training in self-deployment functions such as strategic lift planning and RSO&I. The limited ability to provide critical supply functions hampers Country 1's ability to sustain forces deployed outside of the region. Score: 1
7. Command and Control
Doctrine: Battle command doctrine is well developed; it emphasizes flexibility of command and is based on a decentralized command structure with decision-making authority delegated to subordinate commanders and staffs. Staffs from the tactical to the strategic level conduct operations planning based on the command estimate process. Score: 3
Technology: Commanders and staffs have relatively modern C4I assets to control the battle (gather information, receive guidance, process and disseminate operational plans). Score: 2
Op. Readiness: Country 1's command and control system has been successfully tested in multinational and joint operations, but concerns have been voiced over its ability to withstand exposure to high-intensity conflict. Score: 2

8. Communications
Doctrine: Orders formats are based on U.S. doctrine. While most personnel are not fluent in English, Country 1's doctrine calls for the extensive use of well-trained liaisons to interface with U.S. forces. Score: 2
Technology: Military architecture for C4ISR includes reconnaissance satellites and battlespace command and control systems for ground forces (the joint aspect is not as well developed). Communications systems are digital and compatible with U.S. systems. While Country 1 has secure communications capabilities, it has only a limited ability for secure communications with U.S. networks. Score: 2
Op. Readiness: Country 1 employs a substantial number of liaisons, who are fluent in "US Military English" and knowledgeable on U.S. doctrine. Its communication system is fully functional for battlefield operations: reliable and quickly deployable. However, modifications are required for full interoperability with the U.S. Score: 2

9. Intelligence
Doctrine: Plan to use full range of intelligence capabilities in both joint and combined operations. Well-defined procedures for collection, analysis and dissemination (e.g., IPB). Score: 3
Technology: Broad and sophisticated array of intelligence collection and analysis (HUMINT, SIGINT, ELINT). Country 1's intelligence sharing network is compatible with the US; it relies on platforms similar to ASAS Warlord workstations. Score: 3
Op. Readiness: Based on recent exercise analysis, Country 1's armed forces are capable of conducting the full range of intelligence operations at the combined arms level. Joint intelligence sharing has only recently been established as a priority, and is therefore lacking. Score: 2
Sample MCAT Scorecard Analysis
The following Figures B.2 and B.3 present the final results of an MCAT analysis. Figure B.2 demonstrates the data as submitted onto the worksheet. From this view, specific weaknesses in CAAs or CMs for a single country become obvious. Note that Country 2 has a critical weakness in deployability compatibility. Country 1, while strong in Doctrine, requires some long-term fixes in the Operational Readiness and Training realm.
[Figure B.2 – Completed MFC Scorecard. The scorecard repeats the country data from Figure B.1 (Doctrine, Technology, OR/TRNG and Overall rows for Countries 1-3 across the nine CAAs, each weighted 100), adds flag and graph-number columns, and computes a Compatibility Score for each row: 2.6, 2.0, 1.8 and 2.1 for Country 1; 1.6, 1.1, 1.1 and 1.3 for Country 2; 1.3, 1.6, 1.0 and 1.3 for Country 3.]
Manipulation of the MCAT flags creates the MFC scorecard demonstrated in Figure B.3. By listing the countries hierarchically in terms of compatibility, this scorecard presents the data in a form conducive to cross-country comparisons, both in overall compatibility and across the compatibility measures of doctrine, technology, and operational readiness and training. Note that the "Compatibility Score" column to the far right of the scorecard shows the average score, while the remaining columns allow for comparison in individual CAAs.
[Figure B.3 – MFC Scorecard Demonstrating Ranking by CMs and Overall Compatibility. Setting the flags ranks the rows hierarchically: the Overall rows list Country 1 (2.1) first, followed by Country 3 and Country 2 (1.3 each); the Doctrine rows rank Country 1 (2.6), Country 2 (1.6), Country 3 (1.3); the Technology rows rank Country 1 (2.0), Country 3 (1.6), Country 2 (1.1); and the OR/TRNG rows rank Country 1 (1.8), Country 2 (1.1), Country 3 (1.0).]
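The ordering in Figure B.3 can be recovered directly from the Overall rows of the sample worksheet. The sketch below recomputes each country's Compatibility Score as a simple average of its nine Overall values; the workbook's own rounding differs slightly for Country 2 (it shows 1.3 where a direct average gives 1.2), but the resulting ranking is the same.

```python
# Recomputing the cross-country comparison in Figure B.3 from the sample
# worksheet values (illustrative; small rounding differences are possible).

overall_rows = {
    "Country 1": [2.3, 2.3, 2.3, 2.0, 1.3, 1.7, 2.3, 2.0, 2.7],
    "Country 2": [2.0, 1.3, 1.0, 1.0, 1.3, 0.3, 2.0, 1.3, 1.0],
    "Country 3": [1.3, 1.3, 1.0, 1.3, 1.3, 1.3, 1.0, 1.7, 1.3],
}

scores = {c: sum(v) / len(v) for c, v in overall_rows.items()}
for country, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country}: {score:.1f}")
# -> Country 1: 2.1, Country 3: 1.3, Country 2: 1.2 (workbook rounds to 1.3),
#    matching the hierarchy shown in Figure B.3.
```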