LESSONS LEARNED: Hacking and other malicious threats lurk online, underscoring the need for exceptional products to thwart these attacks. Contracts to develop these products can and will be terminated if the product is deemed unacceptable. This shouldn’t be seen as a contracting failure, but rather as lessons learned on how to successfully pursue software development contracts in the future. (U.S. Army photo)
Lessons learned from a failed contract for All-Army CyberStakes.
by Maj. John Rollinson and Maj. Shane Kohtz
In February 2021, the Army Cyber Institute (ACI) at West Point requested that the contracting officer at the U.S. Army Mission and Installation Contracting Command pursue a termination for default on a contract to develop “novel challenges” for All-Army CyberStakes (AACS), an annual cybersecurity competition. Because of the short timeline and COVID-related budget reductions, the termination meant there would be no competition that year. Although at first glance this looks like a contracting failure, we argue that, aside from a risky timeline, the process worked mostly as it should: the Army did not pay for an unacceptable product. From this perspective, we offer our lessons learned on how to successfully pursue software development contracts in the future.
THE ALL-ARMY CYBERSTAKES CONTRACT
Hosting a cybersecurity competition of the size, duration and caliber of AACS is a significant undertaking. While the ACI has resident technical expertise, it does not have the depth of experience or the personnel to produce the necessary content: 50 self-contained cybersecurity puzzles across six subject areas, ranging in difficulty from tutorial-like to solvable by only the most capable cyber experts. In contrast to other cyber-related training contracts, the content is the only deliverable under the contract, and all material becomes the property of the ACI. The ACI then manages the IT infrastructure and underlying platform for the competition and serves the content using internal resources.
SOFTWARE DEVELOPMENT BEST PRACTICES
The Army continues to emphasize the importance of programs adopting industry best practices for software development. DODI 5000.87, “Operation of the Software Acquisition Pathway,” directed that “programs will require” software teams to use best practices, such as Agile or lean development methods. The instruction also exempted DOD programs on the software acquisition pathway from the legacy Joint Capabilities Integration and Development System requirements generation process, making them less rigid and accelerating software development. These paradigm shifts in software acquisition highlight the importance of agile and iterative software development; however, as the ACI experienced, the standard Federal Acquisition Regulation (FAR) contracting process remains a serial process that encourages legacy software development methods.
SETTING THE STAGE FOR SUCCESS
Two of the most critical phases for a software effort are acquisition planning and contract administration. Accurate cost estimates, iterative deliveries, a logical contract structure and detailed data rights are essential to a successful acquisition plan. During the contract administration phase, government technical experts play a vital role in evaluating the vendor’s proposal and deliveries and in communicating issues. The following ACI lessons learned may assist acquisition professionals in managing the cost, schedule and performance of future software efforts.
Acquisition Planning:
Standard FAR-type contracts require deliberate planning up front, which runs somewhat counter to the iterative approach necessary for software development. We offer five lessons learned for planning software design, development and testing.
- Break the independent government cost estimate (IGCE) into components as much as possible. An IGCE for a software development effort with varying degrees of complexity should estimate the time and effort necessary to complete each software item, an approach known as “work-based costing.” In ACI’s case, the cost estimate should have focused on each software challenge instead of being a blanket estimate for a software team developing for a block of time, known as “level-of-effort costing.” Work-based costing makes it far easier to compare the government’s cost assumptions against submitted proposals (see the sketch following this list).
- Require the vendor to deliver early and deliver often. Frequent deliveries allow more objective monitoring of progress and product quality, and the iterative approach reduces the risk of significant rework at final delivery.
- Sequence each capability or feature as a separate contract line-item number (CLIN), with appropriate quantities when multiples of the same capability are required (for example, three binary challenges of medium difficulty) and with proposed challenge descriptions broken out. This gives the government additional flexibility and ensures costs accurately reflect what it is receiving.
- Ensure the deliverables (CLINs) are priced logically in a fixed-price contract. The CLINs on the AACS contract had the same price despite different levels of complexity for each development phase.
- Clearly identify data rights for every software development effort. The government maintains unlimited rights in data (including software code) that the vendor first develops under the contract (FAR Part 27, Patents, Data, and Copyrights). Clear contractual expectations for data are critical both at final delivery and in a potential termination.
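To make work-based costing concrete, below is a minimal sketch in Python. All quantities, hours and the labor rate are hypothetical illustrations; none of these figures come from the AACS contract. Each CLIN is priced from an estimate of the work it contains, so prices vary with difficulty rather than collapsing into one flat level-of-effort figure.

```python
"""Work-based IGCE sketch: price each CLIN from estimated hours
instead of one level-of-effort figure for the whole team.
All quantities, hours and the labor rate are hypothetical.
"""

LABOR_RATE = 150.00  # notional fully burdened rate, $/hour (assumption)

# (CLIN description, quantity, estimated hours per item)
CLINS = [
    ("Tutorial-level challenge", 10, 20),
    ("Medium-difficulty binary challenge", 3, 60),
    ("Expert-level challenge", 2, 120),
]

total = 0.0
for description, quantity, hours_each in CLINS:
    unit_price = hours_each * LABOR_RATE
    line_total = unit_price * quantity
    total += line_total
    print(f"{description}: {quantity} x ${unit_price:,.2f} = ${line_total:,.2f}")

print(f"Work-based IGCE total: ${total:,.2f}")

# A level-of-effort estimate would instead price a team for a block of
# time (e.g., four developers for six months), hiding per-item costs
# and making it harder to compare proposals line by line.
```

Comparing a per-CLIN estimate like this against a vendor proposal also makes a “lazy” pricing schedule, in which every CLIN is priced identically, stand out immediately.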
Contract Administration:
The ACI noticed significant issues during the contract administration phase, which eventually led to contract termination. Four administration actions are necessary for success on a software development contract.
- Proposal technical screening is a critical activity for a successful software contract. The ACI screened out one company that would clearly have been out of its technical depth, but the contractor ultimately chosen gave good answers to key technical questions and still failed to deliver. In particular, challenge “lazy” pricing schedules, such as a proposal in which every CLIN carries the same price despite variable costs and difficulty. Major discrepancies between the IGCE and proposed pricing most likely reflect misunderstandings or differences in underlying assumptions about the requested product.
- “Trust but verify” is critical and requires a technical representative with adequate time and resources to review the contract deliverables. Technical red flags to look for include failure to read the contract (deliverables in the wrong format); vendor methodologies or technical approaches inappropriate for the problem; and immature internal processes that spill into the contract relationship (claims of “miscommunication,” “confused” deliveries, passing blame to others, etc.).
- Trust but verify again, again and again. The software world is complex and copyright violations occur, as the ACI observed during the development phase. It is important to check the delivered software’s components and dependencies for compliance with their licensing terms (a minimal compliance-scan sketch follows this list). The AACS software vendor not only plagiarized from unlicensed sources, but also failed in several instances to comply with the terms of open-source licenses, such as providing the exact source code used (GNU General Public License) and including mandatory acknowledgments (Apache License, Version 2.0). A vendor’s use of outside code after claiming internal development is an immediate red flag and must be raised with the contracting officer.
- Maintain good communication. Keeping the contracting officer, the contracting officer’s representative and any technical advisers on the same page is critical, both to ensure the government pays promptly for correct deliverables and to ensure performance concerns are articulated accurately (from a technical perspective) and correctly (from a contract law perspective).
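The license-compliance check above lends itself to partial automation. Below is a minimal sketch, assuming the delivered source sits in a local directory; the license markers and file extensions are illustrative assumptions, and dedicated scanners such as ScanCode or FOSSology are far more thorough. A hit does not prove a violation; it simply tells the technical representative where to verify compliance, for example, that GPL-covered code ships with its exact source and that Apache 2.0 attributions are present.

```python
"""Minimal license-marker scan of a delivered source tree.

Flags files whose headers mention common open-source licenses so a
reviewer knows where to verify compliance. Illustrative only.
"""
import os
import re
import sys

# Markers that commonly appear in license headers or bundled LICENSE files.
LICENSE_MARKERS = {
    "GPL": re.compile(r"GNU General Public License", re.IGNORECASE),
    "Apache-2.0": re.compile(r"Apache License,? Version 2\.0", re.IGNORECASE),
    "MIT": re.compile(r"MIT License", re.IGNORECASE),
}

# File types worth scanning (an assumption; adjust to the deliverable).
SOURCE_EXTENSIONS = {".py", ".c", ".h", ".cpp", ".go", ".js", ".java", ".rs"}

def scan(root):
    """Walk the tree and report files that mention a known license."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            is_license_file = name.upper().startswith(("LICENSE", "NOTICE"))
            if os.path.splitext(name)[1] not in SOURCE_EXTENSIONS and not is_license_file:
                continue
            path = os.path.join(dirpath, name)
            try:
                # License headers sit at the top; the first 4 KB is enough.
                with open(path, encoding="utf-8", errors="ignore") as f:
                    head = f.read(4096)
            except OSError:
                continue
            for license_name, pattern in LICENSE_MARKERS.items():
                if pattern.search(head):
                    print(f"{path}: mentions {license_name} -- verify its terms are met")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")
```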
WHEN THINGS GO WRONG
Drive toward the idea that a “failed contract” is not a “contracting failure” when dealing with software. Software development fails—a lot—so this outcome should almost be an expectation for software contracts. The government should not only be comfortable terminating a contract and restarting it, but should also approach every software contract with that possibility built into both the schedule and the final cost estimate. Software development iterations are a necessary reality, and the ACI recommends the following steps when things go wrong.
- Hold to the contract, particularly deadlines. Software estimates are notoriously inaccurate and early missed deadlines are likely to compound, not disappear.
- Talk early and often across the government side of the contract team. Early issues will expand and create more issues that will likely be more costly and time consuming as the contract progresses.
- Know the contracting and software development processes, and be prepared to execute a rebid or an award to the next bidder.
- Do not pay for substandard work. Withholding payment keeps the money available for getting the product done correctly.
- Salvage and mandate delivery of any data from the vendor after contract termination. Discretion for material delivery after termination lies with the contracting officer. Even though AACS did not occur, the vendor developed some challenges that, if delivered, would have been useful ACI training aids.
CONCLUSION
As counterintuitive as it may seem, ACI’s largest lesson from this failed contract is that software contracts must be written for failure because the software will “break” and have issues. This means changing government cultural expectations and understanding that software acquisitions will often need more than one attempt. The government must hold contractors accountable for the quality and timeliness of deliverables and remain flexible enough to rapidly switch contractors when there is a failure to meet contractual expectations. To move in that direction, the contracting team and the customer must communicate clearly and work together throughout the process to ensure the project is successful—even when an individual contract award is not.
For more information, contact Maj. John Rollinson or Maj. Shane Kohtz at the Army Cyber Institute at john.rollinson@westpoint.edu or shane.kohtz@westpoint.edu.
MAJ. JOHN ROLLINSON holds an M.S. in computer science from Carnegie Mellon University and is currently serving as a research scientist at the Army Cyber Institute at West Point, New York.
MAJ. SHANE KOHTZ holds an MBA from the Naval Postgraduate School with a focus on systems acquisition management. He is a member of the Army Acquisition Corps, holds a DAWIA Practitioner certification in program management and currently serves as a cyber research manager at the Army Cyber Institute at West Point, New York.