Stakeholders broke out of traditional roles while testing and evaluating cybersecurity at NIE 16.2, learning that when red and blue teams talk earlier and more often, cyber systems get stronger.
by Lt. Col. Jeff Strauss and Mr. Robert Wedgeworth
The ever-increasing complexity and interconnectivity of Army tactical networks and mission command systems, along with the requirement for mission assurance in the contested domain of cyberspace, present a unique challenge to operational testing and evaluation (T&E). The challenges in cyber T&E stem from several factors, first among them the sheer number of devices and the amount of data they exchange. Coupled with the growing size, evolution and complexity of software and the ever-present human factor risks, these factors can make it seem nearly impossible to assess the true cybersecurity posture of our networks.
These challenges call for new and innovative ways to partner for success in cyber T&E, in fact a fundamental change in our traditional approaches. One such successful partnership was evident recently in the teaming of multiple organizations at Network Integration Evaluation (NIE) 16.2. During this event in May 2016, the stakeholders charged with developing, testing, fielding and ultimately operating and defending tactical networks and mission command systems took a fresh look at cybersecurity T&E paradigms, including the exchange of information.
These stakeholders included program managers (PMs) from the Office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASA(ALT)), along with testers from the U.S. Army Test and Evaluation Command (ATEC), the U.S. Army Research Laboratory (ARL), the U.S. Army Training and Doctrine Command (TRADOC) G-2, and the Threat Systems Management Office (TSMO). Army cyber defenders at the brigade, division and regional cyber center levels completed the team.
Cybersecurity T&E requirements are grounded in DOD Instruction 5000.02, “Operation of the Defense Acquisition System,” and other supporting regulations and directives. The primary purpose of cybersecurity T&E is to determine the operational impact of real-world cyber effects on the unit’s mission. The overall evaluation of a system’s cyber posture is a result of testing across the spectrum of developmental and operational environments, which typically follow the test-fix-test model. The operational test (OT) environment is the most complex and involves linking the system under test to the Soldier operators and defenders in an operational environment, including a representative cyber threat force.
CYBER TESTING STEP BY STEP
The first step to cyber testing during an OT event is a cooperative vulnerability and penetration assessment (CVPA). Cybersecurity professionals evaluate the system to uncover all potential vulnerabilities and threat vectors. The system technical experts, typically program office or field service representatives, and network defenders cooperate fully and work directly with ARL testers to perform a comprehensive assessment. The CVPA typically occurs weeks or months before the actual OT. The results of the CVPA are shared with defenders and owners of the system under test, who then cooperate to correct any cyber deficiencies before the next phase of testing.
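To make the CVPA concrete, the sketch below shows the flavor of one small cooperative check: probing a short list of hosts for exposed management services and reminding the team to verify that default credentials have been changed. It is a minimal illustration only; the addresses, ports and credential pairs are hypothetical placeholders, not the actual CVPA toolset or any finding from NIE 16.2.

```python
#!/usr/bin/env python3
"""Illustrative sketch of one small CVPA-style check.

Probes a list of hosts for common management ports and reminds the
cooperative team to verify that default credentials have been changed.
All hosts, ports and credential pairs are hypothetical placeholders.
"""

import socket

# Hypothetical systems under test (TEST-NET placeholder addresses).
HOSTS = ["192.0.2.10", "192.0.2.11"]

# Management services commonly reviewed during a cooperative assessment.
PORTS = {22: "ssh", 23: "telnet", 80: "http", 443: "https"}

# Default credential pairs the team would verify manually against vendor documentation.
DEFAULT_CREDS = [("admin", "admin"), ("admin", "password"), ("root", "root")]


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def assess(hosts, ports):
    """Build a simple findings list of reachable services on each host."""
    findings = []
    for host in hosts:
        for port, service in ports.items():
            if port_open(host, port):
                findings.append({"host": host, "port": port, "service": service})
    return findings


if __name__ == "__main__":
    for f in assess(HOSTS, PORTS):
        print(f"{f['host']}:{f['port']} ({f['service']}) is reachable; "
              f"verify default credentials such as {DEFAULT_CREDS} have been changed")
```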
The second cybersecurity test event is the adversarial assessment. This “assesses the ability of a unit equipped with a system to support its missions while withstanding validated and representative cyber threat activity.” Additionally, testers are chartered to “evaluate the ability to protect the system, detect threat activity, react to threat activity, and restore mission capability degraded or lost due to threat activity.” In NIE 16.2, cyber operators from the TSMO assumed this adversarial role, attempting to gain access, exploit vulnerabilities and create mission effects on the systems under test.
BLUE VS. RED
In a traditional OT environment, participants maintain a rigid separation of the test audience, known as the Blue Team, and the opposing threat forces, or the Red Team, to preserve the operational realism of the test event. In the cyber domain, this “firewalling” of the red and blue elements historically has led to disappointing and frustrating cyber assessments.
There are several challenges with this traditional model. The primary challenge is a lack of timely detailed feedback on the systems and the efforts to defend them; feedback typically is not available until well after all testing is completed. Without any dialogue among stakeholders, these OT events fail to achieve their full potential in uncovering system vulnerabilities and developing improvement strategies for detection and mitigation. While traditional tests typically achieve the goal of demonstrating the operational risk of cyber vulnerabilities, they fall short of the goal to actually improve prevention, detection and mitigation procedures.
Historically, OT cyber testing has revealed a consistent list of problems: default passwords, misconfigured hardware, poor user behavior and unpatched vulnerabilities. While these findings are important, much more can and should be learned from these rare opportunities to exercise cyber defense in a realistic environment. When cybersecurity OT finds only seemingly simple issues that surface routinely, it leads to frustration for decision-makers at every level.
“Firewalling” key players during cyber OT often results in the system’s PMs discovering the “bad news” far too late in the system life cycle, when making meaningful changes is more costly and time-consuming. The lack of real-time feedback was also a problem for principal decision-makers throughout the acquisition and T&E communities who desired more comprehensive exploration of cyberattack vectors and methods.
A DIFFERENT APPROACH
During NIE 16.2, Brig. Gen. Kenneth L. Kamper, then commanding general of ATEC’s U.S. Army Operational Test Command, envisioned a different approach to cyber OT centered on teaming. “We have some very specific goals when it comes to cyber operation testing and protocols that need to be followed for good reasons, but we also ought to be using every opportunity to learn and get better every day,” said Kamper after the event.
Striking that balance was the goal of several partner agencies charged with the conduct of cyber OT at NIE 16.2. The central concept involved much more frequent and results-minded interaction between the red and blue elements. The assumption was that if the network defenders (Blue Team) were provided more information about how the cyber threat (Red Team) was behaving, then they would be in a much better position to prevent, detect, react to and ultimately defeat the cyber threat and restore systems. The result would be a more comprehensive assessment of the cybersecurity posture of systems under test during the condensed testing window of the 14-day evaluation.
The Blue Team met with the Red Team before the event and at the midpoint to discuss what each was seeing on the network. These formative discussions, while somewhat guarded to maintain a spirit of fair competition, were productive in ensuring that the teams were not overly focused on one aspect of the network and systems. At the end of the event, a much more robust and open technical exchange was conducted. This exchange, labeled the “Tech-on-Tech,” was analogous to the after-action reviews that are a staple of the combined arms training centers. Here, the red and blue teams discussed their plans and actions during each phase of the test event. The discussion allowed an immediate, in-depth analysis of the action-to-counteraction maneuvering on the network and resulted in lessons learned for both the defenders and those responsible for system engineering and design.
TECH-ON-TECH
A special feature of this exchange was the presentation of a codified assessment of defenders’ actions against the threat. This evaluation rubric outlined behaviors and criteria along a continuum of observed indicators from the viewpoint of the adversary. The Red Team essentially told the Blue Team how hard the Blue Team made each phase of the threat presentation based on discrete observations of the network security. The feedback from the event was uniformly positive. One observer from the Blue Team stated that he learned more during this event than from all previous NIEs combined. This positive response has prompted decision-makers to further explore and codify this concept for future NIEs and similar cyber test events.
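The article does not reproduce the rubric itself, but the sketch below illustrates how such a continuum of observed indicators might be codified, scoring each defensive function from the adversary’s viewpoint. The phase names mirror the protect, detect, react and restore functions described earlier; the indicator descriptions and point values are invented for illustration and are not the actual NIE 16.2 criteria.

```python
"""Hypothetical illustration of a Red Team scoring rubric for defender actions.

Phases mirror the protect/detect/react/restore functions described above;
indicator text and point values are placeholders for illustration only.
"""

from dataclasses import dataclass


@dataclass
class Indicator:
    phase: str        # defensive function being scored
    description: str  # observed behavior, from the adversary's viewpoint
    score: int        # 0 = no resistance observed, 3 = threat objective denied


# Example continuum of observed indicators (placeholder content).
RUBRIC = [
    Indicator("protect", "Default credentials accepted on first attempt", 0),
    Indicator("protect", "Initial access required multiple attack vectors", 2),
    Indicator("detect",  "No defender response observed during access attempts", 0),
    Indicator("detect",  "Defenders identified and reported the intrusion source", 3),
    Indicator("react",   "Malicious traffic blocked after detection", 2),
    Indicator("restore", "Degraded mission capability restored during the event", 3),
]


def summarize(observed: list[Indicator]) -> dict[str, int]:
    """Roll up the highest score awarded by the Red Team for each phase."""
    summary: dict[str, int] = {}
    for item in observed:
        summary[item.phase] = max(summary.get(item.phase, 0), item.score)
    return summary


if __name__ == "__main__":
    print(summarize(RUBRIC))
```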
While senior leaders in the test and PM communities push for more opportunities to partner closely in cyber T&E, they are also paying special attention to preserving the integrity and validity of operational realism. In planning future exchanges during OT, caution is warranted in sharing data among developers, defenders and testers. It is critical that teams not exchange so much information that they mask system issues and make system performance appear better in a test than it would in a true operational situation. Invalid testing could allow the fielding of substandard equipment, threaten our national security and ultimately cost service members’ lives.
The stakeholders at NIE 16.2 did an excellent job of balancing this need to maintain threat integrity for the system under test with the desire to make systems better through collaboration. While these partnering events were not as robust as exchanges held during training events or Army Warfighting Assessments, they re-established the notion of “one team” and helped break down the “us vs. them” atmosphere that can inhibit positive exchanges and improvement in cybersecurity.
Ensuring that systems are ready for Soldiers to rely on them on the battlefield remains the focus of operational testing, and these exchanges helped to meet that end. The Tech-on-Tech discussion, observed by PMs and developers, provided great insight into the test and how systems fared against a representative cyber threat. The content was much more technical than at previous events, covering specific software and hardware vulnerabilities and exploitations. During the final exchange, subject matter experts from the blue and red teams participated in focused discussions with system developers on how to improve the system under test.
CONCLUSION
The initial feedback on these discussions has been very positive. Col. Greg Coile, project manager for the Warfighter Information Network – Tactical, praised the continued partnering initiative. “The insights we gained in near-real time of potential vulnerabilities in the network and applications enabled us to make rapid improvements to continue to harden the network,” Coile said after the event.
A post-test presentation of NIE 16.2 cyber findings, hosted by the Program Executive Office for Command, Control and Communications – Tactical (PEO C3T) after a more comprehensive analysis of the event results, discussed various source code and software features that could be modified to enhance security. This review looked at network diagrams and screenshots of trouble areas, among other analysis, and reinforced the spirit of partnership as developers, PM system engineers, various software testers, Red and Blue teams, and PM and PEO leadership worked together to better understand the cybersecurity posture and performance of the tested systems.
After the event, Nancy Kreidler, the information assurance program manager for PEO C3T, summed it up this way: “The follow-on technical exchange between the Red Team and our larger team of security engineers from the program offices was invaluable. It allowed our folks to look at vulnerabilities in a new light and get after some of these challenges in our labs.”
The unassailable truth about cybersecurity is that the discipline is evolving at a rate that challenges our current processes all along the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, facilities and policy. If we are to have any chance to surmount this rapidly changing problem, we must be willing to challenge our own culturally entrenched ways of thinking about the problems and refuse to become moored to any idea that limits our overall ability to respond to change and accomplish valid and reliable testing. Partnership among all stakeholders is the key to tackling these difficult problems in a dynamic discipline.
For more information on how programs can succeed through increased partnering between the test and acquisition communities, or to request test team support, go to https://www.atec.army.mil/rfts.html.
LT. COL. JEFF STRAUSS is the senior acquisition adviser in the Survivability Evaluation Directorate of ATEC, with over 10 years of acquisition and T&E experience. He holds a master’s degree in cybersecurity policy from the University of Maryland, Baltimore County and a B.S. in construction science from Texas A&M University. A member of the Army Acquisition Corps (AAC), he is Level II certified in program management and is a certified Project Management Professional.
MR. ROBERT WEDGEWORTH is a threat cybersecurity operations test lead with TSMO under the Program Executive Office for Simulation, Training and Instrumentation. He has over 15 years of experience in the areas of information warfare and cyberspace operations. He has an M.S. in systems engineering (information warfare) from the Naval Postgraduate School and a B.S. in mathematics from Auburn University. He is Level III certified in information technology and a member of the AAC.