SHIFT LEFT
an objective to accomplish, and the test plans are tailored to meet the data and decision-making needs of all users. A single planning procedure is necessary so that decisions and data from both parties can be combined without mixing apples and oranges. Procedures must also incorporate the decision-making process to account for test outcomes that will require modifying future steps in the test process.
Test execution—Both agencies must agree in advance what they will do while executing the testing and, most important, what they will do when testing reveals something unexpected (higher- or lower-than-expected performance, or a failure). For example, the vendor may want to demonstrate a capability such as top speed, whereas the government wants statistical assurance of the same metric, which may require more samples. Additionally, the government may want to look at top speed as the system ages to see how time and usage affect this capability.
Reporting—Reporting could be one of the easiest aspects to combine between organizations. But again, how data support the parties’ decisions may shape how findings are reported. Test planning may not need to address reporting at all, as long as the two agencies agree on how the information is shared. Is a formal report required, or is a briefing chart sufficient? A spreadsheet with results, or a database?
Test article configuration—This aspect should also be easy to combine. However, the reality is that configuration can change based on how the data support decisions. In particular, it may be desirable to change the configuration for design and engineering purposes, but to keep it stable or fixed for requirements and mission capability assessment.
Take the frequent example of software updates. There should be a plan as to when updates will occur. If testing reveals the need for an unplanned software update, the teams must come together to determine when to insert it into the schedule and how this unplanned “fix” affects testing: Does testing need to start from zero, or can it continue from the cut-in point? If the update adds capability, what is the impact on evaluation of the system?
—MR. HARRY H. JENKINS III
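The sample-size point in the test-execution item above can be sketched quantitatively. A single vendor demonstration run shows that a top speed is achievable once; statistical assurance means estimating the mean top speed to within a stated margin at a stated confidence level, which dictates how many runs are needed. The sketch below uses the standard normal-approximation formula n = (z·σ/E)²; the function name and every number (run-to-run standard deviation, margin, confidence) are illustrative assumptions, not figures from this article.

```python
# Illustrative only: why "statistical assurance" can demand more test
# runs than a one-off demonstration. All numbers are hypothetical.
import math
from statistics import NormalDist

def runs_needed(sigma: float, margin: float, confidence: float = 0.95) -> int:
    """Runs needed so a two-sided (confidence)-level interval for the
    mean top speed has half-width <= margin, given run-to-run
    standard deviation sigma (normal approximation, sigma known)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided z-value
    return math.ceil((z * sigma / margin) ** 2)

# With 2 mph run-to-run scatter, pinning the mean to +/- 1 mph at 95%
# confidence takes 16 runs -- far more than a single demonstration.
print(runs_needed(sigma=2.0, margin=1.0, confidence=0.95))  # 16
```

Tightening the margin or raising the confidence level drives the run count up quickly, which is exactly the planning tension between a vendor demonstration and a government assurance requirement.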
compliance. Depending on the technology’s maturity—the technology readiness level—high failure rates may be acceptable. But if the technology readiness level is high (e.g., greater than 6 on the standard DOD readiness scale of 1 to 9), then high failure rates could indicate poor quality or design.
In short, bad news does not get better with time. It is always best to test in a robust, realistic way to identify failures early, providing time for correction if necessary, rather than hiding them by testing in unrealistic ways that pamper the system. Well-designed systems can operate as intended and do not induce delays in testing, thus satisfying requirements and saving test time and money.
KEEPING IT REAL
All parties involved must become comfortable with the risks of realistic testing. Contractors need to overcome the resistance to share data that may be critical of their design, as this early feedback is exactly what the Army T&E community needs. The Army needs to be receptive to early discovery of issues and provide feedback to the contractor to mature the product and achieve the desired end state. Early discovery minimizes the expense of corrective actions or design changes to mature a concept.
The PM and ATEC can accept contractor data from nongovernment test facilities, but no single approved process, policy or overall guidance exists to fit every testing scenario. Audits of test sites and reviews of testing procedures and reporting requirements are necessary to assess each scenario on a case-by-case basis. In some cases in which test data already exist, ATEC will need to assess the pedigree of the data.
Combining government and contractor testing is also important in supporting
114 Army AL&T Magazine October-December 2018