Killing the ‘Creep’


JLTV’s competitive prototyping effort fills a gap in the light tactical vehicle fleet while preventing test creep by bringing together stakeholders to eliminate unplanned, unfunded requirements

By LTC Misty L. Martin, Ms. Danielle Wayda, Mr. Steve Martin and Mr. Josh Pagel

The Joint Light Tactical Vehicle (JLTV) program is one of the first to implement DOD’s competitive prototyping policy. Established in 2007, the policy stipulates that two or more competing original equipment manufacturers (OEMs) must produce prototypes to reduce risk, maximize performance, decrease costs and synchronize requirements. Simply put, this means that in addition to the normal test objectives and issues, all JLTV OEM vehicle prototypes were required to be tested consistently, fairly and separately.

The JLTV program’s engineering and manufacturing development (EMD) phase concluded in late 2014 after an aggressive, 14-month test schedule specifically intended to generate data sufficient to inform the Source Selection Evaluation Board (SSEB) and the capability production document (CPD) development, and provide data for the Milestone C (MS C) decision. Each of the many testing categories contained numerous subtests addressing requirements and required diligent management to avoid costly “test creep”—unplanned and unfunded test data requirements identified after the start of test execution. Simply put, test creep adds risk—in cost, schedule and performance—to programs and can delay or even end an otherwise successful program. Successfully avoiding these impacts to the program’s tight schedule and budget required detailed planning and budgeting, careful management and control, and constant communication with a diverse set of stakeholders: the U.S. Army Test and Evaluation Command (ATEC); the Office of the Secretary of Defense’s director of operational test and evaluation; the deputy assistant secretary of defense for developmental test and evaluation; the Marine Corps Operational Test and Evaluation Activity (MCOTEA); and Army and Marine Corps combat developers. The JLTV program yielded a number of lessons learned that will be shared with other programs employing a competitive prototyping strategy.

The tight schedule and extensive EMD testing, combined with heel-to-toe vehicle testing, required the JLTV Product Manager for Test (PdM Test) team to understand thoroughly when to push back on test creep. For example, simply asking what the program and stakeholders would gain by conducting more testing, and then showing the corresponding low return on investment, was sometimes all that stood between staying on schedule and under budget and incurring schedule and cost overruns.
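
To make that return-on-investment argument concrete, the sketch below scores each proposed test addition by the new evaluation data it would buy relative to its cost and schedule impact. It is a hypothetical illustration of the reasoning, not the program’s actual method; the events, numbers and scoring rule are all invented.

```python
# Hypothetical illustration of the ROI pushback described above; the proposed
# events, costs and scoring rule are invented, not JPO JLTV's actual method.

CANDIDATES = [
    # (proposed addition, est. cost in $K, est. schedule days, new data elements satisfied)
    ("extra cold-soak cycle",     250, 10, 0),  # duplicates data already planned
    ("added fording depth point", 120,  4, 2),
    ("repeat ride-quality run",    90,  6, 0),
]

def roi(cost_k, days, new_data):
    """Crude return on investment: new evaluation data per unit of cost and schedule."""
    return new_data / (cost_k + 10 * days)  # weight a schedule day like $10K (assumed)

for name, cost_k, days, new_data in CANDIDATES:
    score = roi(cost_k, days, new_data)
    verdict = "decline or defer" if score == 0 else "consider"
    print(f"{name:28s} ROI = {score:.4f} -> {verdict}")
```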

There were instances when test creep was a reality, and no amount of discussion could keep it at bay. Weeks of assertive back-and-forth dialogue on the test and evaluation master plan (TEMP) were spent on in-the-weeds details. The program office’s position was that those details belonged in a detailed test plan (DTP) rather than in the TEMP, where they become binding regardless of any risk-benefit analysis.

The key to this effort was striking a balance and obtaining stakeholder buy-in: too much detail could cause program failure through restrictive wording, while too little could cause future funding issues. Requirements management is the program’s foundation, and that foundation must be kept rock-solid, without allowing test creep to erode it.

ACQUISITION TRAILBLAZER: The JLTV program is one of the first to implement DOD’s competitive prototyping policy, which stipulates that two or more competing OEMs must produce prototypes to reduce risk, maximize performance, decrease costs and synchronize requirements. (Image courtesy of the Program Executive Office for Combat Support and Combat Service Support)

EMD PLANNING
In planning for JLTV EMD, we had to understand clearly what we were providing the warfighter and the risks associated with building it. Each test category contained numerous subtests addressing specific requirements. For example, automotive performance testing included soft-soil mobility, sand-slope traversing, braking, steering and handling, ride quality, fording, fuel consumption, top speed, acceleration, and grades and slopes, as well as several other tests. Ballistic testing required additional test assets at the subsystem level (nine armored chassis plus numerous armor coupons, or armor samples) in addition to the 27 system-level test assets among the 66 test assets overall.

Test planning and DTP development were a several-month endeavor that involved multiple draft revisions, requiring weekly (and often daily) communication between the test-site subject-matter experts and our PdM Test team.

PdM Test emphasized collaboration among itself, ATEC’s U.S. Army Evaluation Center (AEC), MCOTEA and the various test sites, which ensured an appropriate balance between adequately testing a requirement and over-testing it.

AEC’s data source matrix (DSM) defined the data that AEC and MCOTEA (the evaluators) needed to assess the JLTVs’ effectiveness, suitability and survivability. As such, DTP development focused on the testing needed to provide this data. The JLTV purchase description added test data requirements beyond the DSM because those data were determined to be critical for the SSEB; those requirements were subsequently included in the DTPs as well.

This process also ensured that all stakeholders shared a common understanding of the test procedures, mitigating test creep caused by miscommunication about how requirements were to be tested. Given test program cost and schedule constraints, the Joint Program Office (JPO) JLTV, in coordination with stakeholders, determined which requirements did not need to be tested and could be evaluated through other means. A requirements prioritization based on DSM data needs, along with tier-level criteria driven by the capability development document (CDD) (e.g., key performance parameters and key system attributes versus other requirements), provided the guiding factors.
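
The screening and prioritization just described can be pictured as a simple filter and sort. The sketch below is a notional rendering, not the actual DSM or JLTV requirements: it keeps only the requirements with evaluator data needs and orders them by tier, key performance parameters first.

```python
# Notional sketch of DSM-driven test planning; the requirements, tiers and
# data elements are invented, not the actual JLTV data source matrix.

TIER_RANK = {"KPP": 0, "KSA": 1, "other": 2}  # key performance parameter, key system attribute

# Requirement -> CDD tier and the data elements the evaluators need.
DSM = {
    "soft-soil mobility": {"tier": "KPP",   "data": ["sinkage", "drawbar pull"]},
    "fuel consumption":   {"tier": "KSA",   "data": ["gal/hr at rated load"]},
    "interior stowage":   {"tier": "other", "data": []},  # no DSM need: evaluate by other means
}

def plan_tests(dsm):
    """Return only requirements that need test data, highest tier first."""
    needed = [req for req, info in dsm.items() if info["data"]]
    return sorted(needed, key=lambda req: TIER_RANK[dsm[req]["tier"]])

print(plan_tests(DSM))  # ['soft-soil mobility', 'fuel consumption']
```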

Once the DTP drafts were complete, it was imperative that other stakeholders, such as systems engineering, logistics and the JPO product directors—organ­izations responsible for managing each respective EMD OEM—review the documents to ensure that their respective concerns were addressed.

Test execution, to a greater extent than test planning, required constant interaction among the JPO JLTV, PdM Test, logistics, the budget management office, systems engineering, product directors and the test sites. Daily test update briefs (TUBs) and daily written test status reports ensured that all stakeholders were aware of current test status, which enabled timely identification and mitigation of test-related issues. Weekly test-site test-completion updates were also an important element in managing the test schedule.

THOROUGH ANALYSIS: JLTV’s PdM Test team records vehicle weight during the limited user test. The program’s EMD phase concluded late last year after an aggressive, 14-month test schedule, with 300 test team members collecting data at 17 test sites. (Photo courtesy of JPO JLTV)

THREAT NEUTRALIZED
Efforts to minimize test creep began early in the test planning stages. By limiting testing to those test events needed to produce data satisfying DSM needs, the JPO kept extraneous testing out of the test plans. PdM Test monitored test progress against the schedule daily, which enabled decisions on whether to retest failed items after corrective actions were implemented or to adhere to the test schedule and proceed to the next test event. These were typically case-by-case decisions that depended on several factors, such as requirement priority (e.g., whether the requirement was a key performance parameter) and test duration. The TUBs ensured that everyone, including PdM Test leadership, had all of the facts before providing guidance.
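
The retest-or-proceed call can be expressed as a decision rule over the factors the paragraph names. The sketch below is one plausible rendering, with an invented weighting; the program’s actual decisions were made case by case by people, not by a formula.

```python
# One plausible rendering of the retest-or-proceed decision; the rule and
# thresholds are invented, not the program's actual decision process.

from dataclasses import dataclass

@dataclass
class FailedItem:
    requirement: str
    is_kpp: bool              # key performance parameter?
    retest_days: int          # days to rerun after corrective action
    schedule_slack_days: int  # float remaining before the next hard event

def retest_or_proceed(item):
    if item.is_kpp and item.retest_days <= item.schedule_slack_days:
        return "retest after corrective action"
    if item.retest_days > item.schedule_slack_days:
        return "proceed; fold the retest into a later event if possible"
    return "proceed per schedule; evaluate by other means"

print(retest_or_proceed(FailedItem("braking distance", True, 3, 5)))
```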

Ballistic testing was one of the test program’s big success stories, demonstrating how early planning eliminated any push for test creep. The team reduced testing by understanding the requirements and by working closely with the live fire integrated product team to reduce shots where OEM designs made reductions feasible. This shortened the test schedule and reduced cost. Deferring certain testing to the low-rate initial production phase, to be conducted on a single OEM, resulted in additional cost avoidance.

CONCLUSION
The JLTV EMD phase’s success can be attributed to open communication within the program office and among all stakeholders, a solid understanding of the risks the JLTV program faced, constant risk management and mitigation, and test-creep control. PdM Test achieved the EMD test phase objectives, ensured that requirements were tested, and provided the program with the data necessary to support the SSEB, CPD development and, ultimately, the JLTV MS C decision. PdM Test also managed and oversaw the execution of a complex test program that enabled implementation of the competitive prototyping policy with all three OEMs, all while remaining on schedule and under budget. The JLTV program promises to yield a number of lessons that similar programs with a competitive prototyping strategy can leverage.

Those same principles must be applied in JLTV’s next phase. We learned from the last phase that we cannot buckle to each want and whim, as doing so can be detrimental to the program. We must consider and balance each request and maintain constant awareness of the planned end state. Late-game test creep will only slow or halt what has been, to date, a very successful program.

Test programs cannot be developed without planning, budgeting and communication. Once developed, they must be managed, constantly communicated and controlled. Test creep cannot be allowed to wreak havoc; testing must address a specific requirement and must consider risk.

PdM Test and the JPO JLTV (consisting of engineers, logisticians and quality assurance, budgeting and contracting personnel) are reviewing and assessing the EMD phase test results to better understand areas of performance risk, and will provide ATEC with recommendations to improve test efficiency and effectiveness. PdM Test’s goal is to maximize test efficiency and effectiveness during the production phase by eliminating redundant testing (analyzing risk and targeting tests accordingly) and by employing test design techniques to ensure efficiency in producing statistically significant and defensible test results. EMD phase lessons learned in all functional areas within the JPO JLTV will be carried over into the production phase, beginning in this fiscal year’s fourth quarter, to ensure successful program execution.
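
One family of test design techniques suited to that goal is design of experiments, which replaces ad hoc repeated runs with a structured run matrix. The sketch below, offered as a generic illustration rather than anything drawn from the JLTV test program, builds a full-factorial matrix over invented factors so that every combination is covered exactly once.

```python
# Generic design-of-experiments illustration; the factors and levels are
# invented examples, not JLTV test conditions.

from itertools import product

factors = {
    "payload": ["curb", "rated"],
    "terrain": ["paved", "gravel", "soft soil"],
    "ambient": ["hot", "cold"],
}

# Full factorial: 2 x 3 x 2 = 12 runs cover every factor combination once.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i:02d}: {run}")
```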

THE RIGHT CHOICE: To test three prototypes during the JLTV program’s EMD phase, Soldiers and Marines participated in a training exercise facilitated by the U.S. Army Operational Test Command on Fort Stewart, GA, in October 2014. (Photo courtesy of JPO JLTV)

For more information, go to http://www.peocscss.army.mil/.


LTC MISTY L. MARTIN is the PdM for test, JPO JLTV. She holds an M.A. in defense management and B.A. degrees in psychology and sociology. She has served in several ground vehicle assignments, including with the Project Management Office for Stryker as the assistant PdM for command, control, communications, computers, intelligence, surveillance and reconnaissance and as the PM forward in Afghanistan, and with the U.S. Army Special Operations Command as the Special Mission Units systems acquisition manager for weapons and vehicles. She is Level III certified in program management and Level I certified in test, systems planning, research, development and engineering (SPRDE) and science and technology, and is a member of the Army Acquisition Corps (AAC).

MS. DANIELLE WAYDA is the senior test lead within PdM Test for JPO JLTV. She holds an M.S. in engineering management from Oakland University and a B.S. in mechanical engineering from Lawrence Technological University. A member of the AAC, she is Level III certified in SPRDE and Level II certified in test and program management.

MR. STEVE MARTIN plans and executes testing events as the JPO JLTV Army developmental test/operational test (DT/OT) test lead for PdM Test. He holds an M.S. in hazardous waste management from Wayne State University and a B.S. in engineering chemistry from Oakland University. He is Level III certified in SPRDE, Level II certified in test and Level I certified in production, quality and manufacturing. He is also certified as a Quality Engineer by the American Society for Quality.

MR. JOSH PAGEL provides contract support for Booz Allen Hamilton Inc. and currently supports JPO JLTV as the DT/OT test engineer. He holds an M.E. in mechanical and aerospace engineering from the University of Virginia and earned a B.S. in mechanical engineering from the University of Michigan. He has spent more than 17 years supporting Army and Marine Corps tactical ground vehicle development, with nearly 10 of those years in the test and evaluation field.


Disclaimer: Reference herein to any specific commercial company, product, process or service by trade name, trademark, manufacturer or otherwise does not necessarily constitute or imply its endorsement, recommendation or favoring by the U.S. government or the DA. The opinions of the authors expressed herein do not necessarily state or reflect those of the U.S. government or the DA, and shall not be used for advertising or product endorsement purposes.

This article was originally published in the July – September 2015 issue of Army AL&T magazine.
