Standardized Measures of Performance Framework enables consistent assessment of Army network capability
By Mr. Michael Badger, Dr. Dennis Bushmitch, Mr. Rick Cozby and Mr. Brian Hobson
The Army’s adoption of the Agile Process to enable rapid technology insertion led the three agencies charged with executing this process—the U.S. Army Test and Evaluation Command (ATEC), the U.S. Army Training and Doctrine Command (TRADOC) Brigade Modernization Command (BMC) and the Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASA(ALT))—to organize as the TRIAD and develop the needed measurement framework.
The TRIAD intended that the measurement framework would establish consistent, reusable, traceable, standardized performance and effectiveness metrics across the Agile Process. More specifically, the TRIAD envisioned that this framework would preserve resources and reduce risk in planning and executing the culminating activity of the Agile Process, a Network Integration Evaluation (NIE).
The testing of complex networks and their capabilities can be time- and resource-intensive, with minimal potential to reuse the capability developed for a test event. Testing without well-defined analytic objectives and repeatable measures of performance (MoPs) can waste time and money. Furthermore, without an Armywide objective standard for test and evaluation (T&E) metrics, the results will be less than compelling to senior decision-makers. Different organizations supporting the Agile Process and NIE events often misinterpret, inappropriately apply or reinvent the current set of network-related MoPs for each application (e.g., a T&E event).
The complex system-of-systems (SoS) solutions that comprise the Army’s network demand a measurement framework with traceable and credible measures, encompassing the interaction among various network layers; command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) systems; and the technical requirements that underpin them. Beginning with the FY12 NIE events, an enduring MoP Framework emerged as a potential solution standard, developed by ASA(ALT), ATEC, BMC, the federally funded research and development center MITRE Corp., and subject-matter experts (SMEs) from the Program Executive Office Command, Control and Communications – Tactical (PEO C3T).
The MoP Framework, which the TRIAD has used successfully and matured during the planning and execution of five NIEs, achieves the following:
- Standardizes the terms of reference for each individual MoP and its application.
- Defines instrumentation considerations and practices in support of MoPs.
- Enables organizations using the MoPs to establish traceability to credible source documentation (operational and analytic requirements).
- Allows organizations to identify gap(s) in MoP availability, application maturity and definition visually, through the use of graphics.
- Allows organizations to re-prioritize the MoPs within each graphical representation according to analytic engineering or T&E requirements.
- Allows simple, graphical communication of T&E and analytic requirements among organizations from an operational perspective and at multiple levels (system, SoS, mission command tasks and operational effectiveness).
- Standardizes the units of measurement.
- Mitigates errors in interpretation, instrumentation, and approaches to data collection, reduction and analysis.
The key new concept introduced in the enduring MoP Framework is called a MoP map.
Figure 1 represents such a map for an operational capability category and subcategory. (See definitions in Figure 2.) Figure 1 also illustrates the inclusion and alignment of various reference attributes, such as layers, data types and source MoPs. SMEs and organizations create and tailor different MoP maps for different operational capability subcategories, systems and/or SoSs within a subcategory.
The vertical axis of the MoP map relates top-level mission effectiveness MoPs to lower-level waveform, spectrum and radio frequency (RF) MoPs. The horizontal axis relates operational mission threads, applications, information exchanges and data types within a given system or SoS operational capability category. The attributes along this horizontal axis allow for MoP alignment to a variety of mission threads (e.g., call for fire); applications and information exchanges (e.g., message type); and data types (e.g., voice and video).
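The two-axis structure described above lends itself to a simple grid representation: rows for the framework's network layers and columns for data types. The following is a minimal, illustrative sketch only; the layer and data-type names follow the article, but the `MoPMap` class, its methods and all MoP identifiers are hypothetical, not part of the published framework.

```python
# Illustrative sketch of a MoP map as a two-dimensional grid:
# rows are the framework's layers (vertical axis) and columns are
# data types (horizontal axis). MoP identifiers here are invented.

LAYERS = [  # ordered top (mission effectiveness) to bottom (spectrum/RF)
    "mission effectiveness", "mission threads", "application",
    "COE/security", "network routing/QoS", "network transport",
    "waveform", "spectrum/RF",
]

DATA_TYPES = ["voice", "video", "message"]

class MoPMap:
    """A MoP map for one operational capability subcategory."""

    def __init__(self, subcategory):
        self.subcategory = subcategory
        self.cells = {}  # (layer, data_type) -> list of MoP identifiers

    def add_mop(self, layer, data_type, mop_id):
        if layer not in LAYERS or data_type not in DATA_TYPES:
            raise ValueError("unknown layer or data type")
        self.cells.setdefault((layer, data_type), []).append(mop_id)

    def gaps(self):
        """Cells with no MoP yet: the gap view the framework visualizes."""
        return [(l, d) for l in LAYERS for d in DATA_TYPES
                if (l, d) not in self.cells]

# Example: a hypothetical map with a single MoP aligned so far
m = MoPMap("MC Display Hardware")
m.add_mop("network transport", "voice", "MoP-NT-001")
print(len(m.gaps()))  # 23 of the 24 layer/data-type cells remain unfilled
```

The `gaps()` method mirrors the framework's stated ability to expose, at a glance, where MoPs have yet to be developed for a given subcategory.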
The MoP Framework employs several reference attributes to support the standardization and traceability of requirements. These attributes, as Figure 2 shows, correlate to credible operational capability categories and subcategories, align to layers of user application, are traceable to data types, and feature a source reference set of credible and established metrics. The MoP map accomplishes the following functionality:
- Aligns MoPs to operational capability categories and subcategories, enabling credible application to operational systems.
- Maps MoPs to user application layers, allowing flexibility.
- Enables traceability of MoPs to application data types, enabling their reusability and completeness across operational capabilities.
- Aligns credible, applicable and reusable metrics, increasing efficiency across a user community from multiple organizations.
- Establishes relationships among different MoP maps by cross-referencing graphical tools.
- Provides a powerful graphical representation tool for traceability to the parent operational requirement and MoP.
- Provides a simple reference scheme for easy identification and traceability of MoP types, the MoP system layer and the operational capability type.
- Establishes and standardizes definitions and units of measurement.
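The cross-referencing capability listed above can be pictured with a small sketch. This is purely illustrative: each "map" is reduced to a mapping from data type to a set of MoP identifiers, and every identifier shown is invented for the example.

```python
# Illustrative sketch of cross-referencing two MoP maps by shared data type.
# Each map here is simplified to: data type -> set of MoP identifiers.
# All identifiers are invented.

def cross_reference(map_a, map_b):
    """Return the MoPs that both subcategory maps apply to the same data type."""
    shared = {}
    for data_type in map_a.keys() & map_b.keys():  # data types in common
        common = map_a[data_type] & map_b[data_type]
        if common:
            shared[data_type] = common
    return shared

mission_command = {"message": {"MoP-NT-001", "MoP-AP-004"}, "voice": {"MoP-NT-002"}}
fires_support   = {"message": {"MoP-NT-001"}, "video": {"MoP-WF-009"}}

print(cross_reference(mission_command, fires_support))
# {'message': {'MoP-NT-001'}}
```

Reusing the same MoP identifier across maps is what makes this intersection meaningful; it is the mechanism by which one measurement, instrumented once, can serve multiple operational capability subcategories.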
The MoP Framework developers identified, developed and defined a set of operational capability areas that encompass the potential systems—Capability Set (CS), System Under Test and System Under Evaluation—and network capabilities envisioned as part of the Agile Process. Figure 2 defines these operational capability areas and their categorization, and depicts a unique numbering schema for each subcategory to preserve uniqueness and allow for traceability.
The intent of these defined operational capability categories is to align operational gaps with projected needs and requirements within operational capability categories, and to establish, define and employ consistent, credible and reusable metrics. These metrics, in turn, inform and characterize the performance and effectiveness of operational capability against defined requirements. Because these metrics must align to and support different attributes, the MoP maps were developed with three attribute alignment considerations: network layers, data types and MoP sources, as follows:
Network layers—Layering is an accepted approach to focusing and constraining the complexity of technical network analysis. The complete set of MoP Framework layers includes mission effectiveness; mission threads; application; Common Operating Environment (COE)/security; network routing/quality of service; network transport; waveform; and spectrum/RF. The vertical axis of “layering” in the MoP Framework in Figure 1 has evolved and matured through application to include high-fidelity measurement needs at the bottom of the axis (spectrum, RF and waveform), transitioning to lower-fidelity measurement needs at the top of the axis (mission effectiveness and mission threads).
Data types—As depicted in the generic MoP Framework, several data types within each operational capability subcategory could apply to different MoPs. The horizontal axis in Figure 1 relates the various operational mission threads, applications, information exchanges and data types to one another within a given system or SoS category. The traceability of MoPs within data types between different operational capability subcategories allows analysts to cross-reference MoP maps.
Measures of performance sources—In developing the MoP Framework and the individual MoP maps, the TRIAD leveraged a body of work led by the TRADOC Analysis Center to identify a framework for Agile Process analytic requirements. (See Figure 3.) This analytic framework established a hierarchy of operational issues and essential elements of analysis, allowing for a credible and traceable source of MoPs.
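The analytic hierarchy that sources MoPs (operational issues containing essential elements of analysis, which in turn source individual MoPs) can be sketched as a small tree. This is a hypothetical illustration: the structure follows the article's description, but the issue names, EEA labels and MoP identifiers are invented.

```python
# Illustrative sketch of the analytic hierarchy that sources MoPs:
# operational issue -> essential elements of analysis (EEAs) -> MoPs.
# All entries are invented examples.

hierarchy = {
    "SoS operational issue 1": {
        "EEA 1.1": ["MoP-1.1-NT-001", "MoP-1.1-NT-002"],
        "EEA 1.2": ["MoP-1.2-AP-001"],
    },
    "System operational issue 2": {
        "EEA 2.1": ["MoP-2.1-WF-001"],
    },
}

def source_of(mop_id, tree):
    """Trace a MoP back to the EEA and operational issue it derives from."""
    for issue, eeas in tree.items():
        for eea, mops in eeas.items():
            if mop_id in mops:
                return issue, eea
    return None

print(source_of("MoP-1.2-AP-001", hierarchy))
# ('SoS operational issue 1', 'EEA 1.2')
```

Walking the tree downward enumerates the credible source set; walking it upward, as `source_of` does, is the traceability the framework requires of every metric.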
Figure 4 shows the application of the MoP Framework methodology to the Mission Command (MC) Display Hardware operational capability subcategory.
As depicted in Figure 5, the performance MoPs are predominantly in the area of SoS operational issues. Figure 5 also depicts the evolving and maturing capability of the MoP Framework maps, as the MoPs for the COE/security layer have yet to be developed and coordinated.
Each MoP has a unique number. This numbering schema allows analysts and evaluators to leverage the MoP Framework for MC Display Hardware and import the information to event- or system-specific data source matrices, while still maintaining the traceability and origin of these MoPs.
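A unique-identifier scheme of this kind is straightforward to parse and resolve. The actual Army numbering schema is not reproduced in this article, so the format below (subcategory, layer code, sequence number) and the layer abbreviations are invented purely to illustrate how a structured identifier carries its own traceability.

```python
# Hypothetical MoP identifier scheme illustrating traceable numbering:
# <subcategory>-<layer code>-<sequence>, e.g. "1.2-NT-007".
# The format and layer codes are invented for this sketch.
import re

MOP_ID = re.compile(r"^(?P<subcat>\d+\.\d+)-(?P<layer>[A-Z]{2})-(?P<seq>\d{3})$")

LAYER_CODES = {
    "ME": "mission effectiveness", "MT": "mission threads",
    "AP": "application",           "CS": "COE/security",
    "NR": "network routing/QoS",   "NT": "network transport",
    "WF": "waveform",              "RF": "spectrum/RF",
}

def trace(mop_id):
    """Resolve a MoP identifier back to its subcategory, layer and sequence."""
    match = MOP_ID.match(mop_id)
    if not match:
        raise ValueError(f"malformed MoP identifier: {mop_id}")
    return {"subcategory": match["subcat"],
            "layer": LAYER_CODES[match["layer"]],
            "sequence": int(match["seq"])}

print(trace("1.2-NT-007")["layer"])  # network transport
```

Because the identifier itself encodes its origin, a MoP imported into an event-specific data source matrix remains traceable without any side lookup, which is the property the framework's numbering schema is designed to preserve.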
By identifying and aligning MoPs for each operational capability subcategory, the MoP Framework provides credible and traceable metrics for analysts that are reusable across Agile Process activities and between organizations in support of a particular application (i.e., event). This reusability is based on repeated application of operational capability and the repeated need to measure operational performance and utility.
The standardization of a MoP Framework Armywide will promote cost avoidance by reducing the re-creation of testing objectives and streamlining instrumentation planning. The implementation of a unified MoP Framework will also give greater validity to the operational relevance of testing. Analytic requirements exchanged between organizations using this standardized construct provide for clear cost-evaluation guidelines, prioritization and traceable evaluation.
For more information, please contact Dr. Dennis Bushmitch (email@example.com, 410-322-2054) or Mr. Brian Hobson (firstname.lastname@example.org, 913-544-5101).
MR. MICHAEL BADGER is a senior network engineer for PEO C3T. He holds a B.S. in mechanical engineering from the Rutgers College of Engineering and an MBA from Monmouth University. He was a resident senior executive fellow of the Harvard Kennedy School of Government in 2010. Badger is Level III certified in systems planning, research, development and engineering (SPRDE) – systems engineering and is a member of the U.S. Army Acquisition Corps (AAC).
DR. DENNIS BUSHMITCH is an inventor and prolific technical author, and has been a chief analyst for several Army programs. He holds an M.S. and Ph.D. in electrical engineering from the Polytechnic Institute of New York University. He is Level III certified in SPRDE – systems engineering and is a member of the AAC.
MR. RICHARD “RICK” COZBY is the deputy director for SoS engineering and integration within the Office of the ASA(ALT). He holds a B.E. in electrical engineering from Vanderbilt University, an M.S. in administration from Central Michigan University, and an M.A. in management and leadership from Webster University. He is Level III certified in program management and in test and evaluation, and is a member of the AAC.
MR. BRIAN HOBSON is a senior analyst, senior program manager and deputy director for Trideum Corp., Huntsville, Ala. He holds a B.S. from the United States Military Academy at West Point and an M.S. in operations research from the Air Force Institute of Technology. He is a lifetime member of the International Test and Evaluation Association and the Military Operations Research Society.
Contributing to this article were Mr. Vince Baxivanos, Ms. Christina L. Bouwens, Dr. Melanie Bragg, Dr. Nancy M. Bucher, Ms. Karen Drude, Ms. Diane Eberly, Mr. Derek Erdley, Mr. Na Gaither, Mr. Omar Gutierrez, Dr. John Harwig, Mr. Anthony W. Harriman, Mr. Michael S. Jessee and Dr. Chris Morey.